In data structures and algorithms, iteration and recursion are the two fundamental problem-solving approaches, and this reading examines recursion more closely by comparing and contrasting it with iteration. Iteration's main strength is that it can repeatedly run a group of statements without the overhead of function calls or the use of stack memory; recursion is comparatively slow because every call has to create, and later discard, a stack frame. Recursion, on the other hand, is in some situations the more convenient tool: search procedures such as depth-first search and iterative deepening are usually easiest to express as a function that calls itself, and even simple tasks such as finding the maximum number in a set, or locating a value that happens to sit at the front of a list (found on the very first step either way), can be written naturally in both styles, which raises the recurring question of whether recursion offers any real advantage in such scenarios.

The two styles are analysed differently. For iterative code the analysis is relatively simple: count the number of loop iterations and multiply by the cost of the loop body, so the complexity can be fixed or variable depending on the loop structure. For recursive code you write down a recurrence: the recursive factorial, for instance, satisfies T(N) = T(N-1) + O(1) (assuming constant-time multiplication), which solves to O(N), and initial assignments such as F[0] and F[1] each cost O(1). As a rough rule of thumb, a recursive algorithm that makes b recursive calls per level and goes L levels deep does roughly O(b^L) work. Some problems are hard no matter which style is used: the Towers of Hanoi has exponential complexity in any formulation.

Recursion by itself does not reduce a program's time complexity; what iteration reliably removes is the per-call overhead, which is why iteration is usually faster in practice. A tail-recursive call, however, can be optimised the same way as any other tail call, so languages that perform this optimisation close much of the gap. Conversely, hand-converting a recursive algorithm to an iterative one can make the code more complex and, if done carelessly, can even hurt its time complexity. Binary search is one of the rare cases where recursion is perfectly acceptable, but slicing the array on every recursive call is absolutely not appropriate, because the copies dominate the cost. When benchmarking the two styles, compare implementations of the same algorithm with the same time complexity (say, both linear); otherwise the measurement reflects the algorithms rather than recursion versus iteration. Plotting the exponential running time of a naive recursive solution against the linear running time of its dynamic-programming counterpart makes that difference unmistakable.
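To make these trade-offs concrete, here is a minimal sketch in Python (the function names are illustrative, not taken from any particular source) of factorial written both ways; both perform O(n) multiplications, but the recursive version also consumes one stack frame per level.

```python
def factorial_recursive(n):
    # Base case: 0! is defined to be 1.
    if n == 0:
        return 1
    # Recursive step: n! = n * (n-1)!  ->  T(n) = T(n-1) + O(1) = O(n).
    return n * factorial_recursive(n - 1)


def factorial_iterative(n):
    # Same O(n) number of multiplications, but no extra stack frames.
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result


assert factorial_recursive(5) == factorial_iterative(5) == 120
```

In a language with tail-call optimisation the recursive form could be rewritten with an accumulator so that it, too, runs in constant stack space; CPython performs no such optimisation, so the loop is the safer default there.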
Both approaches involve executing instructions repeatedly until the task is finished; the primary difference is that recursion is a process in which a function calls itself, terminating when a base case is met, while iteration repeats a block of code until the condition controlling the loop fails. As a rule of thumb, recursion is easy for humans to understand and often adds clarity: one study, replicating a 1996 experiment, compared students' ability to comprehend recursive and iterative programs and found a recursive version of a linked-list search function easier to comprehend than the iterative one. Iteration, by contrast, carries no per-call overhead, so it is fast compared to recursion.

The analysis techniques from above apply directly. If a function simply walks its input once, it runs in O(n). In a slightly richer example, a for loop that advances by 2 performs about n/2 iterations, and if it is invoked from a recursion of depth n/5, the total work is roughly (n/5) * (n/2) = n^2/10; since big-O notation keeps only the dominant term and describes the worst-case upper bound, this is O(n^2). Space is counted the same way: a computation that fills an m*n matrix has space complexity O(m*n). Should one solution be recursive and the other iterative, the time complexity is the same, provided it really is the same algorithm implemented twice, once recursively and once iteratively. Implementation details can still dominate in practice: in one comparison over roughly 50 MB input files, a recursive depth-first search finished in about 9 seconds while a particular iterative version took several minutes, so a clean recursive program can easily beat a clumsy iterative one.

Fibonacci is the standard illustration of how much the formulation matters. The naive recursive approach takes O(2^n) time, whereas the iterative approach is O(n). We can optimise the recursive function by computing the solution of each subproblem only once, and the iterative method is the preferred, faster approach because it stores the two most recent Fibonacci numbers in two variables (previousPreviousNumber and previousNumber) and uses currentNumber for the running value, as sketched below. A classic treatment of this distinction separates linear recursive processes, iterative processes (which an efficient recursive fib actually generates), and tree recursion (which the naive, inefficient fib exhibits).

Tree traversal shows the other direction of the translation. A recursive traversal uses the call stack, while an iterative traversal performs exactly the same steps but keeps the pending nodes in a user-defined stack (or, for breadth-first order, a queue). Morris inorder traversal goes further: starting with current at the root, it first creates links to each node's inorder successor, prints the data by following those links, and finally reverts the changes to restore the original tree, so it needs no stack at all. Two smaller practical notes round this out: to prevent integer overflow when halving a range, compute the middle element as M = L + (H - L)/2 rather than M = (H + L)/2; and the same recursive-versus-iterative comparison applies to sorting, where insertion sort (a stable, in-place algorithm that builds the sorted array one item at a time) is normally written as a loop, while quicksort is normally presented recursively.
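Here is a sketch of that iterative Fibonacci routine, with variable names following the description above (an illustration, not code quoted from the sources):

```python
def fib_iterative(n):
    # F(1) = F(2) = 1; only the two most recent values are kept,
    # so the loop runs in O(n) time and O(1) extra space.
    if n <= 2:
        return 1
    previous_previous_number, previous_number = 1, 1
    for _ in range(3, n + 1):
        current_number = previous_previous_number + previous_number
        previous_previous_number, previous_number = previous_number, current_number
    return current_number


print(fib_iterative(10))  # 55
```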
In computational mathematics, an iterative method is a procedure that starts from an initial guess and generates a sequence of improving approximate solutions; in algorithm analysis, the closely related iteration (or substitution) method solves a recurrence relation by repeatedly expanding it until a pattern emerges, typically stopping the expansion when the argument of T reaches 1. Recursion and iteration are equally expressive: recursion can be replaced by iteration with an explicit call stack, while iteration can be replaced with tail recursion. Choosing between them therefore comes down to knowing the pros and cons of both.

The basic recipe for recursive analysis is to express the cost of the nth call in terms of the previous calls. Recursion produces repeated computation by calling the same function on a simpler or smaller subproblem; iterative code often has polynomial time complexity and is simpler to optimise. In the Fibonacci table built earlier, for example, we set f[1] and f[2] to 1 as the base values. Interpolation search follows the same loop-based pattern: in each pass of the loop it calculates the probe position "pos" from the probe-position formula and narrows the range accordingly. As general usage guidance, recursion is chosen where time complexity is not the pressing issue and the code needs to stay small; quicksort, for instance, has O(N log N) average-case and O(N^2) worst-case time whether it is written recursively or iteratively (the original Lisp, a truly functional language, naturally favoured the recursive formulation).

There are rules for writing recursive functions. Every recursive function needs a base case and must make progress toward it; if the limiting criterion is never met, a recursive function fails to terminate just as an unbounded while loop does, and the program eventually breaks, in the recursive case by exhausting the stack. A recursive implementation also requires, in the worst case, a number of stack frames (invocations of subroutines that have not finished running yet) proportional to the depth of the recursion; for depth-first search on a graph that depth can reach the number of vertices. To visualise the execution of a recursive function, it helps to draw the tree of calls it generates.

Iteration and recursion are normally interchangeable, so which is better depends on the specific problem. The Towers of Hanoi is the classic case where recursion wins on clarity: the objective of the puzzle is to move the entire stack of disks to another rod, obeying simple rules (only one disk can be moved at a time, only the top disk of a stack may be moved, and no disk may ever be placed on a smaller one). The recursive factorial, by contrast, is O(N) in time either way, and a plain double loop over an n-by-m structure runs in O(n*m) time with O(1) extra space.
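A minimal recursive sketch of the Towers of Hanoi (rod labels and the moves list are illustrative choices) makes the exponential cost visible: the recurrence for the number of moves is T(n) = 2T(n-1) + 1, which solves to 2^n - 1.

```python
def hanoi(n, source, target, spare, moves):
    # T(n) = 2*T(n-1) + 1  =>  2^n - 1 moves in total, regardless of coding style.
    if n == 0:
        return
    hanoi(n - 1, source, spare, target, moves)   # move n-1 disks out of the way
    moves.append((source, target))               # move the largest disk
    hanoi(n - 1, spare, target, source, moves)   # move the n-1 disks back on top


moves = []
hanoi(3, "A", "C", "B", moves)
print(len(moves))  # 7 == 2**3 - 1
```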
Recursion has the overhead of repeated function calls: each call creates a stack frame that must later be destroyed, so the constant factors grow, and in most languages the recursive version can blow the stack outright once the recursion depth times the frame size exceeds the available stack space. An iterative function, by contrast, runs in a single frame, and iteration is generally preferred when time and memory constraints are tight. There are exceptions, such as tail-call optimisation; note that "tail recursion" and "accumulator-based recursion" are not mutually exclusive, since passing an accumulator is usually exactly how a plain recursive function is made tail recursive, and sometimes that rewrite is quite simple and straightforward. What we lose in readability when we convert by hand, we gain in performance; many programmers find a simple loop the more straightforward form to read, although for inherently recursive structures the opposite is often true.

Some problems remain naturally recursive. Insertion into a binary search tree follows the shape of the tree itself, and a function that finds the smallest element of mylist between the indices first and last is most easily stated in terms of a smaller version of the same problem. The analysis is also structural: in a recursion tree where every node has two children the work doubles per level, while for bubble sort the efficiency analysis simply counts its two kinds of tasks, comparisons and swaps. Time complexity, in all of these cases, just means the time needed for the algorithm to complete as a function of the input size, and a common classroom exercise makes the point directly: evaluate the strengths and weaknesses of recursive algorithms in terms of the time taken, and compare them with their iterative counterparts.

The Fibonacci function shows how large the gap can be. The longer, accumulator-style implementation is linear, while the shorter, naive recursive implementation has exponential complexity O(phi^n), where phi = (1 + sqrt(5))/2, and is therefore much slower. Dynamic programming fixes the time at the cost of space, since results must be stored in a table, and memoization is the other standard remedy: remember the return values you have already computed so they are never recomputed. Using a dict in Python, which has amortized O(1) insert, update and delete, a memoized factorial has the same O(n) order as the basic iterative solution, and the analysis of a recursive power function mirrors that of its iterative counterpart. Remember, too, that every recursive method must have a base case (rule number one), and that matching asymptotics do not guarantee matching stopwatch times: binary search needs about k = log2(N) steps whether it recurses or loops, yet measured run times for the two versions can still differ, and in an isolated benchmark a recursive implementation occasionally even comes out ahead, usually because of differences in the work done per step rather than recursion itself. Usage of recursion is therefore advantageous for shorter code, but it can carry much higher cost when the subproblems overlap and nothing is cached. The same pragmatism applies to sorting: naive sorts like bubble sort and insertion sort are inefficient, so in practice we reach for more efficient algorithms such as quicksort and merge sort.
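The text above mentions dict-based memoization for factorial; the same pattern is most dramatic on Fibonacci, where the overlapping subproblems are what make the naive recursion exponential. A hedged sketch (the function and cache names are mine, not from the sources):

```python
def fib_memo(n, cache=None):
    # Store each computed value so repeated subproblems are looked up, not recomputed:
    # O(n) calls and amortized O(1) dict operations instead of O(2^n) calls.
    if cache is None:
        cache = {1: 1, 2: 1}
    if n in cache:
        return cache[n]
    cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
    return cache[n]


print(fib_memo(50))  # 12586269025, far beyond what the naive recursion could finish
```

An equivalent effect can be had with functools.lru_cache; for factorial, the dict simply caches each n! so that repeated calls cost O(1) after the first.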
However, if time complexity is not an issue and shortness of code is, recursion would be the way to go; recursion versus iteration is one of those age-old programming holy wars that divides the dev community almost as much as Vim/Emacs, tabs/spaces or Mac/Windows, precisely because both sides have a point. Recursion solves complex problems by reducing them to smaller instances of themselves, and it performs especially well on problems based on tree structures. Factorial can be defined in two different ways, and the recursive definition is the more declarative one: the base case is n = 0, where we compute and return the result immediately because 0! is defined to be 1, and the recursive step for n > 0 obtains (n-1)! through a recursive call and completes the computation by multiplying by n. Recursion is an essential concept in computer science and is widely used in searching, sorting and traversing data structures; the Tower of Hanoi, whose objective is to move an entire stack of disks to another rod while moving only one disk at a time, is likewise far easier to state recursively than iteratively.

The price is memory and, often, speed. Recursion takes additional stack space: each recursive call adds a frame, so a recursive traversal of a binary tree with N nodes can occupy N slots of the execution stack when the tree is degenerate, potentially giving a larger space complexity than iteration. A naive divide-and-conquer solution can also be slow for a different reason, namely that the same subproblem is computed twice for each recursive call; the optimised version that stores intermediate results runs in O(n) time with O(n) auxiliary space. Loops pay neither of these costs. When counting the work, include both the arithmetic operations and the data accesses, and remember that n simply measures the input, for example the number of Person objects in a personList; a pair of nested loops over such data gives time complexity O(n*m) with O(1) extra space.
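To see the stack usage concretely, here is a sketch of an inorder traversal of a binary tree written recursively and then with a user-defined stack; the Node class and function names are assumptions made for the illustration, and both versions visit the nodes in exactly the same order.

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right


def inorder_recursive(node, out):
    # Implicit call stack; depth equals the height of the tree (N in the worst case).
    if node is None:
        return
    inorder_recursive(node.left, out)
    out.append(node.value)
    inorder_recursive(node.right, out)


def inorder_iterative(root):
    # Same steps, but the pending nodes live in an explicit Python list.
    out, stack, current = [], [], root
    while stack or current:
        while current:              # walk as far left as possible
            stack.append(current)
            current = current.left
        current = stack.pop()
        out.append(current.value)
        current = current.right
    return out


root = Node(2, Node(1), Node(3))
visited = []
inorder_recursive(root, visited)
print(visited, inorder_iterative(root))  # [1, 2, 3] [1, 2, 3]
```

On a degenerate, list-shaped tree both versions still need space proportional to N, but the explicit stack lives on the heap, so it is not bounded by the much smaller call-stack limit.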
The time complexity of an algorithm is commonly expressed using big O notation, which excludes coefficients and lower-order terms, and when we analyse programs we assume that each simple operation takes constant time. For iterative code the bookkeeping is mechanical: if for every iteration over m we do n units of work, the total is O(n*m); if the only loops in the code together perform about 2n iterations and everything else is constant time, the complexity is O(2n), which is simply O(n). Finding the time complexity of recursion is more involved than that of iteration, which is one reason analysing an iterative algorithm is usually a lot more straightforward than analysing its recursive counterpart. Memory behaves similarly: focusing on space complexity, the iterative approach is more efficient because it allocates a constant O(1) amount for its working variables, whereas transforming recursion into iteration eliminates the implicit stack frames but, if the algorithm genuinely needs to remember N pending items, an explicit stack of N entries takes their place.

The definition of a recursive function is a function that calls itself, referring in part to itself; this is the essence of recursion, solving a larger problem by breaking it down into smaller instances of the same problem, which are in turn broken into still smaller sub-problems. Merge sort is the textbook example: it splits the array into two halves and calls itself on each half. Every recursive function can also be written iteratively, and every recursive algorithm can be converted into an iterative algorithm that simulates a stack on which the recursive calls would have executed; since that removes the call overhead, the iterative form is, for practical purposes, at least as fast, and many people therefore default to it. Both recursion and iteration run a chunk of code until a stopping condition is reached, so the choice is about expression and cost, not about what can be computed.

The recurring examples illustrate the analysis. In the Fibonacci algorithm above, if n is less than or equal to 1 we return n; otherwise we make two recursive calls to compute fib(n-1) and fib(n-2), which leads to recalculation of overlapping subproblems and exponential time, while the loop running from 2 to n is linear and takes far less time to compute the solution; the linear code is longer, but its complexity is O(n). Many compilers will in fact rewrite a recursive call into a tail call or a plain loop. The recursive minimum-finding function behaves like the loop it replaces: the statement return mylist[first] happens exactly once for each element of the input array, so exactly N times overall; each call consumes O(1) operations and there are O(N) calls, so the time complexity is clearly O(N), and both the recursive and the iterative version are O(n) in the number passed to the initial call, although the recursive one also pays O(N) stack space. In binary search, mid is recalculated on every iteration or recursion, so the array is halved each time; the best case is O(1) when the key sits exactly in the middle, and the worst case, with the key at the far end of the array, still needs only about log2(N) probes. The same iterative/recursive pairing exists for generating all subarrays of an array, and sorting offers a reminder that asymptotics are not the whole story: for integer keys, radix sort can be faster than quicksort.
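Here is a sketch of binary search in both styles, using the overflow-safe midpoint formula mid = low + (high - low) // 2 discussed earlier (Python integers do not overflow, so the formula matters mostly when porting to fixed-width languages such as C or Java); the example array and key match the ones used in the discussion below.

```python
def binary_search_iterative(arr, key):
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = low + (high - low) // 2   # overflow-safe form of (low + high) // 2
        if arr[mid] == key:
            return mid
        elif arr[mid] < key:
            low = mid + 1
        else:
            high = mid - 1
    return -1


def binary_search_recursive(arr, key, low, high):
    # Each call halves the range: O(log n) time, O(log n) call-stack depth.
    if low > high:
        return -1
    mid = low + (high - low) // 2
    if arr[mid] == key:
        return mid
    if arr[mid] < key:
        return binary_search_recursive(arr, key, mid + 1, high)
    return binary_search_recursive(arr, key, low, mid - 1)


arr = [5, 6, 77, 88, 99]
print(binary_search_iterative(arr, 88),
      binary_search_recursive(arr, 88, 0, len(arr) - 1))  # 3 3
```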
"Recursive is slower than iterative": the rationale behind this statement is the overhead of the recursive stack, that is, saving and restoring the environment between calls. Recursion involves creating and destroying stack frames, which has real costs, so it requires more memory (to set up those frames) and more time for the same work; it will use more stack space even when there are only a few items to traverse, and processes generally have far less stack space available than heap space. Any recursive solution can be implemented as an iterative solution with a stack, and if recursion is written in a language that optimises tail calls, much of the penalty disappears; in a language that does not, the explicit stack simply requires dynamically allocated memory instead. The failure modes differ too: runaway recursion overflows the stack, whereas an iterative loop whose exit condition is never met just keeps consuming CPU cycles forever. There are still situations where the recursive solution is the better one: traversing a file system is a good example, because the structure itself is recursive (directories contain directories), which is why file-tree APIs such as Java's java.io.File are natural to walk recursively.

For the analysis, the basic idea is to calculate the total number of operations performed at each recursive call and sum them, which amounts to counting the nodes of the recursive call tree; if you have a hard time seeing the logic, drawing that tree (for instance for inputs such as xstr = "ABC" and a second string ystr) makes the structure explicit. The recursive factorial ends up performing its multiplications in the order 1*2*3*4*5, and each level also contributes the constant time of its surrounding bookkeeping. To understand what big O notation buys you, a typical example is O(n^2), where n is the input size and the function inside the parentheses bounds the growth. Concrete questions make good exercises: given the array arr = {5, 6, 77, 88, 99} and key = 88, how many iterations does binary search need? For empirical comparisons, recording start_time and end_time with time.perf_counter() shows how long each version actually took. Related ideas follow the same pattern: backtracking at every step eliminates the choices that cannot lead to a solution, number-theoretic arguments such as the one for Euclid's GCD start from two integers a and b with a > b and recurse on the remainder, and an iterative matrix multiplication mat_mul(m1, m2) is analysed exactly like any other pair of nested loops. The worst-case bounds quoted here are reached on adversarial inputs, constant factors matter for medium to large instances, and in Java, as in most languages, there remain situations where the recursive solution is simply clearer than the iterative one.
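Here is a sketch of that conversion: the same depth-first traversal written recursively and then with an explicit stack (the adjacency-dict graph and the function names are invented for the illustration).

```python
def dfs_recursive(graph, node, visited=None):
    # The call stack implicitly holds the path back to the start node.
    if visited is None:
        visited = set()
    visited.add(node)
    for neighbour in graph[node]:
        if neighbour not in visited:
            dfs_recursive(graph, neighbour, visited)
    return visited


def dfs_iterative(graph, start):
    # A user-defined stack replaces the call stack; no recursion depth limit applies.
    visited, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in visited:
            visited.add(node)
            stack.extend(n for n in graph[node] if n not in visited)
    return visited


graph = {"A": ["B", "C"], "B": ["D"], "C": [], "D": []}
print(dfs_recursive(graph, "A") == dfs_iterative(graph, "A"))  # True
```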
Quoting from the linked post: because you can build a Turing-complete language using strictly iterative structures and a Turing-complete language using only recursive structures, the two are equivalent; indeed, any function that is computable (and many are not) can be implemented either way. The translation in one direction converts the problem into a series of steps that are finished one at a time, one after another, with iterative functions explicitly managing the memory that holds their partial results; the inverse transformation can be trickier, but the most trivial version is just passing the loop state down through the call chain. Any loop can be expressed as a pure tail-recursive function, although it can get very hairy working out exactly what state to pass to the recursive call. Some functions are defined by recursion in the first place, so implementing the definition recursively yields a program that is correct "by definition", and in C and similar languages recursion is routinely used to break a complex problem into smaller pieces.

The standard tools for analysing the recursive form are the recursion tree and the substitution method, and the master theorem is a recipe that gives asymptotic estimates for the class of recurrence relations that show up most often when analysing recursive algorithms. For loops, the general steps are to determine the number of iterations and the number of operations performed in each iteration, then multiply; counting line by line (say, 2 operations on lines 2-3) gives the same answer with more effort. A while loop that executes its O(1) body once for every value between a larger n and 2 therefore has overall complexity O(n), and in the worst case that bound, O(N), is actually reached.

The Fibonacci sequence makes the contrast concrete. It is defined by Fib(n) = 1 for n = 0 and n = 1, and Fib(n) = Fib(n-1) + Fib(n-2) otherwise. The direct recursive implementation of that definition takes O(2^n) time and O(n) auxiliary space for the recursion depth; drawing the recursion tree for input 5 shows clearly how the big problem fans out into repeated smaller ones, and remembering those repeated results is the main idea behind every memoization algorithm. The difference between O(n) and O(2^n) is gigantic, which is what makes the naive method so much slower, even though an iterative and a recursive implementation of the same linear algorithm have the same time complexity; the efficient version has runtime and space complexity O(n). Sorting gives a similar reading of the notation: N log N complexity is just the product of N and the base-2 logarithm of N, and in the bubble sort algorithm the efficiency analysis again reduces to counting its two kinds of tasks. Stack overflow is the practical limit on the recursive side: the stack space allocated to a process is fixed and far smaller than its heap, so both runaway recursion and a while loop that never meets its exit condition are dangerous, and for large or deep structures iteration may be better precisely to avoid stack overflow. Dynamic programming can be studied with both iterative and recursive functions, and recursion, finally, is a separate idea from any particular kind of search such as binary search, even though the two often appear together.
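Since the master theorem is named above as the standard recipe, here is its simplified statement, with the merge sort recurrence as a worked example (a textbook formulation, not taken from this text; the third case also requires a regularity condition omitted here):

```latex
% Simplified master theorem for T(n) = a T(n/b) + f(n), with a >= 1, b > 1:
T(n) = a\,T\!\left(\tfrac{n}{b}\right) + f(n), \qquad
T(n) =
\begin{cases}
\Theta\!\left(n^{\log_b a}\right) & \text{if } f(n) = O\!\left(n^{\log_b a - \varepsilon}\right) \\
\Theta\!\left(n^{\log_b a} \log n\right) & \text{if } f(n) = \Theta\!\left(n^{\log_b a}\right) \\
\Theta\!\left(f(n)\right) & \text{if } f(n) = \Omega\!\left(n^{\log_b a + \varepsilon}\right)
\end{cases}

% Merge sort: a = 2, b = 2, f(n) = \Theta(n), \log_2 2 = 1, so the second case applies:
T(n) = 2\,T\!\left(\tfrac{n}{2}\right) + \Theta(n) = \Theta(n \log n)
```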
For some examples of pushing imperative code in this direction, see the "C++ Seasoning" talk. The debate around recursive versus iterative code is endless, but strictly speaking recursion and iteration are equally powerful: recursion can always substitute for iteration and vice versa, iteration is generally faster, and some compilers will actually convert certain recursive code into iteration for you. Recursion is more natural in a functional style, iteration in an imperative style, and it has been reported that even under GHC a naively recursive function can run noticeably slower than the corresponding loop-like formulation. Memory follows the depth: when you are k levels deep you have k stack frames, so the space consumed is proportional to the depth of the recursion; there is simply more memory required in the case of recursion, and it carries a large amount of overhead compared to iteration. A single point of comparison, however, is biased toward one use case or the other, so measure: recording a start_time and end_time around each version, for example with time.perf_counter(), is enough to see how long they took.

Worked examples make the trade-offs visible. A power function power(M, n) can be written with recursive calls or with a loop, and with the right halving logic both give O(log n) time in the size of the exponent, just as a correctly implemented binary search does; the auxiliary space, though, is O(1) for the iterative implementation and O(log2 n) for the recursive one because of the call stack, while in both cases there is only one copy of the input at any given time, so the input-dependent space is O(N). Converting an integer to binary and printing the digits is another small program that can be written either way. By contrast, a function in which each call creates two more calls has O(2^n) time complexity even if nothing is stored, and the call stack alone makes its space complexity O(n); a fuller explanation can be found in any analysis of the recursive Fibonacci. A method that starts in the middle of an array and extends out to one end is called about n/2 times in the worst case, which puts it in the class O(n), and the Shell sort implementation discussed alongside these examples runs in O(n^2) time; in this kind of code the most frequent costly operation is usually plain assignment, and a loop's complexity remains easy to calculate from the number of times its body executes.

Intuition helps choose the right tool. Imagine a street of 20 book stores and having to find one particular book: visiting the shops one after another is iteration, while handing the request from each shop to the next is closer to recursion, and the total work is the same either way. Graph search and file systems lean recursive, because a file system is itself recursive (folders contain other folders, until finally, at the bottom, there are plain files), whereas flat scans lean iterative. In Python, the bisect module already provides binary search, and for the times bisect does not fit your needs, writing the algorithm iteratively is arguably no less intuitive than recursion and fits more naturally into the language's iteration-first paradigm; there is also the edge case of tail recursion, which some compilers turn into a loop, but CPython does not.
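Here is a sketch of power(M, n) in both styles using exponentiation by squaring, which is what gives the O(log n) bound mentioned above (a naive loop that multiplies n times would be O(n)); the names are illustrative.

```python
def power_recursive(m, n):
    # T(n) = T(n // 2) + O(1)  =>  O(log n) multiplications.
    if n == 0:
        return 1
    half = power_recursive(m, n // 2)
    return half * half if n % 2 == 0 else half * half * m


def power_iterative(m, n):
    # Same squaring idea, driven by a loop over the bits of the exponent.
    result = 1
    while n > 0:
        if n % 2 == 1:
            result *= m
        m *= m
        n //= 2
    return result


print(power_recursive(3, 13), power_iterative(3, 13))  # 1594323 1594323
```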
If a new operation or iteration is needed every time n increases by one, the algorithm runs in O(n) time; the simple case of n iterations of constant work inside a for loop gives the same answer directly, and in sorting terms N log N arises because each of the N values needs about log N comparisons to reach its place. Tail-recursion optimisation essentially eliminates any noticeable difference between the two styles, because it turns the whole call sequence into a jump, and recursive breadth-first search can likewise be replaced by a queue-driven loop; remembering the return values of calls you have already made (memoization) removes the other big recursive cost. Without those aids, a deeply recursive solution has high time and space cost where the iterative solution uses O(1) extra space, so there are scenarios where loops are the more suitable choice, performance-critical code first among them.

The puzzle and list examples from earlier round out the picture. The Tower of Hanoi starts with the disks in a neat stack in ascending order of size on one pole, smallest at the top, forming a cone, and its complexity analysis does not change between the recursive and the iterative version. A recursive function is simply one that calls itself, such as a printList function that prints the numbers 1 to 5 by reducing the task to a smaller version of itself; when the recursion reaches its end, all of those stacked frames unwind in turn. Measured time and space always depend on hardware, operating system, processor and similar factors, and it is worth remembering that, underneath, the computer performs iteration in order to execute your recursive program anyway.
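As a final sketch, here is one way printList might look (the name comes from the text above; this simple linear-recursive implementation, rather than a divide-and-conquer variant, is my assumption). The recursive version pushes one frame per number, and because CPython performs no tail-call optimisation those frames only unwind once the base case is reached, whereas the loop needs no extra frames at all.

```python
def print_list_recursive(n):
    # Print 1..n by first printing 1..n-1, then n (one stack frame per value).
    if n == 0:          # base case: nothing left to print
        return
    print_list_recursive(n - 1)
    print(n)


def print_list_iterative(n):
    # Same output, constant stack usage.
    for i in range(1, n + 1):
        print(i)


print_list_recursive(5)   # 1 2 3 4 5
print_list_iterative(5)   # 1 2 3 4 5
```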