Author: Harkerbest
Statement: This is an original article. Permanent link: https://www.harkerbest.cn/p/1007. Please cite the original address when reposting; unauthorized copying will be pursued!!!
What have we learnt?
- Queue
- Stack
- Linked Lists
- Insertion Sort
- Merge Sort
There’s no doubt that we need to be capable of writing the code for the above data structures and algorithms by hand.
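Since we are expected to write these by hand, here is a minimal sketch of insertion sort in Python (Python is my choice here, not necessarily the course's language):

```python
def insertion_sort(a):
    """Sort list a in place by inserting each element into the sorted prefix."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift elements of the sorted prefix that are larger than key one slot right
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

Being able to reproduce the shifting loop without notes is exactly the kind of thing the exam may test.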
Then we move on to the next topic.
Algorithm Analysis
1. Introduction to Algorithms
- Definition: An algorithm is a set of clear, step-by-step instructions to solve a problem, taking inputs to produce outputs.
- Components: Algorithms manipulate data through structured methods and are the backbone of any program (Program = Algorithms + Data Structures).
2. Ways to Describe Algorithms
- Natural Language: Describes the steps in plain English.
- Pseudocode: Uses programming-like syntax but skips implementation details, bridging concepts to coding.
- Programming Code: Directly written in a programming language, executable by computers.
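To see the three levels of description side by side, here is one illustrative task (summing a list, my own example) described in natural language, pseudocode, and executable Python:

```python
# Natural language: "add up every number in the list and return the total"
#
# Pseudocode (skips implementation details):
#   total <- 0
#   for each x in A:
#       total <- total + x
#   return total
#
# Programming code (executable by the computer):
def total(a):
    t = 0
    for x in a:
        t += x
    return t
```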
3. Why Analyze Algorithms?
- Efficiency Matters: A working program may still be inefficient. As data size grows, runtime becomes critical.
- Example: Various search algorithms demonstrate different levels of efficiency on sorted arrays.
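As a concrete illustration of differing efficiency on sorted arrays, here is a sketch of linear search (O(n)) next to binary search (O(log n)):

```python
def linear_search(a, target):
    """O(n): scan every element until the target is found."""
    for i, x in enumerate(a):
        if x == target:
            return i
    return -1

def binary_search(a, target):
    """O(log n) on a sorted array: halve the search range each step."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

On an array of a million elements, binary search needs at most about 20 halvings, while linear search may need a million comparisons.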
4. Types of Algorithm Analysis
- Correctness: Only correct algorithms are analyzed, i.e. those that produce correct results for all valid inputs.
- Resources Predicted: Memory, computational time, bandwidth, and power consumption.
5. Factors Influencing Computational Time
- Computer and Compiler: Hardware and software both impact runtime.
- Algorithm Design: The algorithm chosen affects performance.
- Input Size: Larger inputs typically increase the runtime.
6. Runtime Complexity Types
- Worst-Case: Maximum time over all inputs of size n.
- Best-Case: Minimum time over all inputs of size n.
- Average-Case: Expected time across various inputs; often complex to determine.
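Linear search makes the three cases easy to see. The sketch below (my own example) counts the comparisons a linear search performs:

```python
def comparisons_linear_search(a, target):
    """Count the element comparisons made by a linear search."""
    count = 0
    for x in a:
        count += 1
        if x == target:
            break
    return count

# Best case: target is the first element -> 1 comparison.
# Worst case: target is last or absent   -> n comparisons.
# Average case (target present, uniformly random position) -> about n/2.
```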
7. Calculating Algorithm Time Complexity
- Basic Operations: Time cost is calculated by counting fundamental operations (e.g., arithmetic, logical, assignment).
- Growth Rate: Determines how runtime changes with input size. Higher growth rates indicate slower algorithms for large inputs.
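Here is one way to count basic operations for the simple summing loop (an illustrative tally; different textbooks count slightly different operation sets):

```python
# Counting operation by operation for summing n numbers:
#   t = 0            -> 1 assignment
#   for x in a:      -> n iterations
#       t = t + x    -> 1 addition + 1 assignment per iteration
# Total: T(n) = 2n + 1 basic operations -> linear growth, i.e. O(n).
def count_ops_sum(a):
    ops = 1          # the initial assignment t = 0
    t = 0
    for x in a:
        t = t + x
        ops += 2     # one addition, one assignment
    return t, ops
```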
8. Big-Oh Notation (O)
- Definition: Upper bound on growth rate, ensuring an algorithm’s time won’t exceed a certain rate for large n.
- Example: f(n) = 2n^2 is O(n^2), meaning it grows no faster than n^2 (up to a constant factor).
9. Other Asymptotic Notations
- Big-Omega (Ω): Lower bound on growth rate.
- Big-Theta (Θ): Tight bound, meaning the algorithm’s runtime is both O and Ω of a function.
10. Common Complexity Classes
- Constant (O(1)): Fixed time regardless of input size.
- Logarithmic (O(log n)): Time increases slowly as input grows (e.g., binary search).
- Linear (O(n)): Time grows directly with input size (e.g., simple loop).
- Quadratic (O(n^2)): Time grows with the square of input size (e.g., nested loops).
- Exponential (O(2^n)): Extremely fast-growing time, often impractical for large n.
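A quick numeric table makes the gap between these classes vivid. The helper below (my own sketch) evaluates each class at a given n:

```python
import math

def growth_table(ns):
    """Tabulate how each common complexity class grows with n."""
    rows = []
    for n in ns:
        rows.append({
            "n": n,
            "O(1)": 1,
            "O(log n)": round(math.log2(n), 1),
            "O(n)": n,
            "O(n^2)": n ** 2,
            "O(2^n)": 2 ** n,
        })
    return rows
```

Already at n = 8 the exponential column (256) dwarfs the logarithmic one (3), and the gap explodes as n grows.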
11. Applying Big-Oh Rules
- Ignoring Constants: Lower-order terms and constants are ignored.
- Logarithm Bases: Base doesn’t impact Big-Oh analysis since it’s a constant factor.
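Dropping constants and lower-order terms can be checked directly from the definition of Big-Oh. The snippet below verifies witness constants for one illustrative polynomial (c = 4 and n0 = 13 are my own choices):

```python
def f(n):
    # An example function with constants and a lower-order term
    return 3 * n * n + 5 * n + 100

# Claim: f(n) is O(n^2). Witness constants: c = 4, n0 = 13,
# because 5n + 100 <= n^2 for all n >= 13, so f(n) <= 4n^2 there.
c, n0 = 4, 13
assert all(f(n) <= c * n * n for n in range(n0, 1000))
```

The 3, the 5n, and the 100 all disappear into the constant c, which is why we simply write O(n^2).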
12. L’Hôpital’s Rule and Growth Rate Comparison
- Rule Application: Helps compare growth rates of functions to determine Big-O, Big-Θ, and Big-Ω relationships.
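For instance, to compare n·log n against n^2 (an illustrative pair, not from the slides), take the limit of their ratio and apply L’Hôpital’s rule:

```latex
\lim_{n\to\infty}\frac{n\log_2 n}{n^2}
  = \lim_{n\to\infty}\frac{\log_2 n}{n}
  \overset{\text{L'H}}{=} \lim_{n\to\infty}\frac{1/(n\ln 2)}{1}
  = 0
```

Since the limit is 0, n·log n grows strictly slower than n^2: it is O(n^2) but not Θ(n^2). A limit of a positive constant would mean Θ, and a limit of ∞ would mean Ω but not O.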
13. Guidelines for Determining Complexity
- Loops: Complexity depends on iterations.
- Nested Loops: Product of inner and outer loop sizes.
- Consecutive Statements: Add complexities; use maximum for Big-O.
- Recursion: Solve recurrence relations to determine complexity.
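The four guidelines can each be sketched with a step counter (my own minimal examples in Python):

```python
# Single loop: O(n) — complexity follows the iteration count.
def single(n):
    steps = 0
    for _ in range(n):
        steps += 1
    return steps                  # n steps

# Nested loops: O(n^2) — the product of the loop sizes.
def nested(n):
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps                  # n * n steps

# Consecutive statements: O(n) + O(n^2) — the maximum dominates, so O(n^2).
def consecutive(n):
    return single(n) + nested(n)  # n + n^2 steps, still O(n^2)

# Recursion: solve the recurrence. Here T(n) = T(n-1) + 1, which gives O(n).
def count_down(n):
    if n == 0:
        return 0
    return 1 + count_down(n - 1)
```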
Is that all?
Someone told me that there were no questions last year asking you to write out the code from the PPT…
And the median score of last year’s midterm was 45/100.
So, why don’t we join a contest? Here we go!