Step 1: Understand what time complexity means. It describes how an algorithm's running time grows as the size of the input data increases.
Step 2: Know that quicksort is a sorting algorithm that works by picking a pivot value and dividing the data into smaller parts (called partitions) around it.
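To make the idea concrete, here is a minimal quicksort sketch in Python (an illustrative, not-in-place version with a simple pivot choice; production implementations usually partition in place):

```python
def quicksort(items):
    if len(items) <= 1:                           # base case: 0 or 1 items are already sorted
        return items
    pivot = items[len(items) // 2]                # pick a pivot (middle element here)
    smaller = [x for x in items if x < pivot]     # one partition
    equal   = [x for x in items if x == pivot]    # items equal to the pivot
    larger  = [x for x in items if x > pivot]     # the other partition
    return quicksort(smaller) + equal + quicksort(larger)

print(quicksort([5, 2, 9, 1, 7, 3]))  # [1, 2, 3, 5, 7, 9]
```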
Step 3: Realize that in the average case, quicksort divides the data into two roughly equal parts at each partitioning step.
Step 4: Understand that each division must examine every item in the current portion once, so it takes time proportional to the number of items being sorted, which is 'n'.
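To see why each division is linear in 'n', here is a sketch of one common in-place partitioning scheme (the Lomuto scheme, used here purely for illustration); the single loop examines every element in the range exactly once:

```python
def partition(items, lo, hi):
    """Partition items[lo..hi] around items[hi]; return the pivot's final index."""
    pivot = items[hi]
    boundary = lo                            # items left of boundary are smaller than the pivot
    for i in range(lo, hi):                  # one pass over the range: ~n comparisons
        if items[i] < pivot:
            items[i], items[boundary] = items[boundary], items[i]
            boundary += 1
    items[boundary], items[hi] = items[hi], items[boundary]   # move the pivot into place
    return boundary
```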
Step 5: Recognize that, because the data is roughly halved at each level, the number of levels of division is proportional to the logarithm of the number of items, which is 'log n'.
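A quick way to see where 'log n' comes from: count how many times a collection of n items can be halved before each piece is down to a single item (this sketch assumes the roughly even splits from Step 3):

```python
import math

def halvings(n):
    count = 0
    while n > 1:
        n //= 2                              # each level of recursion roughly halves the data
        count += 1
    return count

print(halvings(1024), math.log2(1024))       # 10 10.0
print(halvings(1_000_000))                   # 19 (log2 of one million is about 19.9)
```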
Step 6: Combine these two insights: roughly 'n' work is done at each of the 'log n' levels, so the average time complexity is their product, 'n log n'.
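Putting rough numbers on it (assuming log2 of a million is about 20): sorting a million items takes on the order of twenty million operations, while a quadratic algorithm would need on the order of a trillion.

```python
n = 1_000_000
print(n * 20)        # n log n, with log2(n) ≈ 20: about 20 million operations
print(n * n)         # n squared: about 1,000,000,000,000 operations
```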
Step 7: Conclude that the average time complexity of quicksort is O(n log n), indicating it is efficient for large datasets.
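As a rough empirical check (using the quicksort sketch from Step 2; exact times depend on the machine), doubling the input size should make the running time slightly more than double, rather than quadruple as it would for a quadratic algorithm:

```python
import random, time

for n in (100_000, 200_000, 400_000):
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    quicksort(data)                          # sketch defined after Step 2
    elapsed = time.perf_counter() - start
    print(f"n = {n:>7,}: {elapsed:.3f} s")
```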