Algorithms Q&A
log(n!) = Theta(n log n)
+3 votes
How do we explain this equation? It seems clear that log(n!) = O(n log n), since log(1·2·3·4·5·…·n) <= log(n·n·n·…·n) (n times).
However, how do we prove that log(n!) = Omega(n log n)?
asymptotic-notation
asked May 9, 2017 in Asymptotic Analysis by shakexin (180 points); edited Jun 3, 2017 by Amrinder Arora
1 Answer

+2 votes (Best answer)
log(n!) = log(1) + log(2) + ... + log(n-1) + log(n)

You can get the upper bound by replacing every term with log(n):

log(1) + log(2) + ... + log(n) <= log(n) + log(n) + ... + log(n) = n*log(n)

And you can get the lower bound by throwing away the first half of the sum and then replacing each remaining term with the smallest one, log(n/2):

log(1) + ... + log(n/2) + ... + log(n) >= log(n/2) + ... + log(n)
                                       >= log(n/2) + ... + log(n/2)
                                        = (n/2)*log(n/2)

With logs base 2, (n/2)*log(n/2) = (n/2)*(log(n) - 1), which is Omega(n log n). Combining the two bounds gives log(n!) = Theta(n log n).
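As a quick numerical sanity check of the sandwich above, the snippet below (a sketch, using natural logs, which only change the constant factor, and `math.lgamma(n+1)` to compute log(n!) without overflow) verifies (n/2)*log(n/2) <= log(n!) <= n*log(n) for a few values of n:

```python
import math

# Check the sandwich:  (n/2)*log(n/2) <= log(n!) <= n*log(n)
# lgamma(n + 1) computes log(n!) directly, avoiding huge factorials.
for n in [10, 100, 1000]:
    log_fact = math.lgamma(n + 1)        # log(n!)
    upper = n * math.log(n)              # n * log(n)
    lower = (n / 2) * math.log(n / 2)    # (n/2) * log(n/2)
    assert lower <= log_fact <= upper
    print(f"n={n}: {lower:.1f} <= {log_fact:.1f} <= {upper:.1f}")
```

The ratio log(n!) / (n log n) also drifts toward 1 as n grows, matching the Theta bound.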
answered Jun 15, 2017 by Chris, AlgoStar (420 points); selected Jul 18, 2017 by Amrinder Arora
Related questions
+1 vote, 2 answers
Given f(n) = o(g(n)), show that it is not necessary that log(f(n)) = o(log(g(n)))
asked Sep 12, 2016 in Asymptotic Analysis by Amrinder Arora, AlgoMeister (1.6k points)
Tags: asymptotic-notation, small-oh, log

+1 vote, 1 answer
Time Complexity Analysis - 2 (2nd by log n)
asked Jan 19, 2016 in Asymptotic Analysis by anonymous
Tags: analysis, loops, asymptotic-notation

+1 vote, 5 answers
Given f(n) = o(g(n)), prove that 2^f(n) = o(2^g(n))
asked Sep 12, 2016 in Asymptotic Analysis by Amrinder Arora, AlgoMeister (1.6k points)
Tags: exponent, log, small-oh, asymptotic-notation

0 votes, 4 answers
Compare 2^n^2 and 10^n asymptotically
asked Sep 10, 2016 in Asymptotic Analysis by Amrinder Arora, AlgoMeister (1.6k points)
Tags: asymptotic-notation, exponent

0 votes, 1 answer
Analyze Double Loop - j starts from i*i
asked Nov 22 in Asymptotic Analysis by Amrinder Arora, AlgoMeister (1.6k points)
Tags: loops, asymptotic-notation, analysis, asymptotic-analysis
The Book: Analysis and Design of Algorithms | Presentations on Slideshare | Lecture Notes, etc.