+ 1
How Do You Calculate Runtime Complexity?
Programmers use "Big O notation" to talk about time complexity. I understand it is about efficiency and time. How could I know the runtime complexity of my program and if my program is efficient? Does it depend just on the loops in the program or more? Please explain so that a total beginner could understand, or point to relevant resources.
4 Answers
+ 3
ChaoticDawg thank you for that, I'm gonna study it
+ 2
Relevant resource: https://en.wikipedia.org/wiki/Time_complexity?wprov=sfla1
Where I can read:
the time complexity is generally expressed as a function of the size of the input. Since this function is generally difficult to compute exactly, and the running time for small inputs is usually not consequential, one commonly focuses on the behavior of the complexity when the input size increases, that is, the asymptotic behavior of the complexity. Therefore, the time complexity is commonly expressed using big O notation, typically O(n), O(n log n), O(n^α), O(2^n), etc., where n is the input size in units of bits needed to represent the input.
The asymptotic behaviour is the behaviour for n tending to infinity.
For example, as n tends to infinity, n + 3 or even n + 9999999999 behaves just like n, because any constant is too small to make a difference at infinity.
n and 2*n are not equal, but they are of the same order, O(n). n^2, on the other hand, is on a completely different level.
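A quick Python sketch (my own illustration, not from the linked article) of why constant factors drop out but higher powers don't: the ratio 2n / n stays fixed at 2 no matter how large n gets, while n^2 / n equals n itself and grows without bound.

```python
# Compare how 2n and n^2 grow relative to n as n increases.
for n in (10, 1_000, 100_000):
    ratio_constant = (2 * n) / n    # always 2.0: same order as n
    ratio_quadratic = (n * n) / n   # equals n: a strictly faster-growing order
    print(n, ratio_constant, ratio_quadratic)
```

The first ratio never changes, which is exactly why 2*n is still written as O(n); the second keeps climbing, which is why n^2 gets its own class, O(n^2).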
+ 2
Let's say you have a program that takes an integer N as input and prints the sum of all the integers from 1 to N.
If your program needs N iterations of a loop to calculate the sum, then its complexity is of the order of O(N).
If your program needs N*5+20 iterations, its order of complexity is still O(N).
If it needs N^2 iterations, then its order is O(N^2).
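As an illustration (my own sketch, not part of the original answer), here are two ways of computing that same sum in Python. Both return the correct result, but the first does one unit of work per value (O(N)) while the second redundantly adds 1 over and over (on the order of N^2 additions, so O(N^2)):

```python
def sum_linear(n):
    # One loop iteration per value from 1 to n: O(n) time.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_quadratic(n):
    # Adds 1, i times, for each i: about n*(n+1)/2 additions, so O(n^2) time.
    total = 0
    for i in range(1, n + 1):
        for _ in range(i):
            total += 1
    return total

print(sum_linear(10), sum_quadratic(10))  # both print 55
```

For completeness, the closed-form n*(n+1)//2 computes the same sum in constant time, O(1), which shows that the same problem can have solutions in several different complexity classes.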