Learn Data Structures and Algorithms
MP4 | Video: AVC 1280x720 | Audio: AAC 44KHz 2ch | Duration: 14M | Lec: 10 | 27 MB
Genre: eLearning | Language: English

Master all concepts of Data Structures and Algorithms
Data structures are the programmatic way of storing data so that it can be used efficiently. Almost every enterprise application uses various data structures and algorithms in one way or another. This course will give you a solid understanding of the data structures and algorithms needed to reason about the complexity of enterprise-level applications.
An algorithm is nothing more than a defined series of steps to achieve a goal. For example, adding a value to the value in a memory location works something like this:
load existing value from memory into register A
load new value into register B
add register A and register B, with result in register A
write register A to memory
It's simple, it's short and it's trivial, but it is an algorithm, one for adding a value to a memory location. If you need to do this sort of thing, it's handy to know how.
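The four steps above can be sketched in Python, modeling memory as a dictionary of addresses and the registers as plain variables (the address and values are illustrative, not from the course):

```python
# A toy model of the four-step algorithm: memory is a dict keyed by
# address, registers A and B are ordinary variables.
memory = {0x10: 5}        # existing value stored at address 0x10

reg_a = memory[0x10]      # load existing value from memory into register A
reg_b = 3                 # load new value into register B
reg_a = reg_a + reg_b     # add register A and register B, result in A
memory[0x10] = reg_a      # write register A back to memory

print(memory[0x10])       # 8
```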
More complex algorithms, of course, are also more interesting: binary search, Quicksort, Boyer-Moore string searching. Algorithms are important because they are how we actually get things done. Now, as to why it's important to learn about them in detail, such as the performance characteristics of a quicksort versus a merge sort, well, that comes down to time, cost, and effectiveness.
Let's see how this applies by examining two sorting algorithms and how they compare when used in different sized data sets. We'll examine bubble sort and quicksort.
If you don't know "Big-O" notation, it's an indicator of an algorithm's efficiency: how its running time grows with the size of its input. The variations you're likely to run across most are:
O(1) - runs in constant time, regardless of size of input
O(n) - runs in time proportional to size of input (100 items takes twice as long as 50 items)
O(n log n) - runs in time proportional to n times the logarithm of n (this is common for sorting algorithms)
O(n^2) - runs in time proportional to the square of the number of items
There are others, but let's leave it at those for now. So, suppose we had 100 items. An O(n) algorithm would require 100 "steps" to do its work, whereas an O(n^2) algorithm would require 100^2, or 10,000, steps.
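Plugging n = 100 into those growth rates is a one-liner each (using a base-10 log here, as the text does below):

```python
import math

n = 100
print(n)                          # O(n):       100 steps
print(n * n)                      # O(n^2):     10,000 steps
print(round(n * math.log10(n)))   # O(n log n): 200 steps
```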
So let's take bubble sort as a starting point. Bubble sort is an O(n^2) algorithm: if you want to sort 100 items, it will require 10,000 "steps". By contrast, quicksort is an O(n log n) algorithm, meaning that sorting 100 items requires about 100 * log(100), or roughly 200, steps. (Strictly, the logarithm is usually taken in base 2, which would make log(100) closer to 6.6, but the base only changes the count by a constant factor, so it's easier to work with base 10 for the moment and it doesn't alter the comparison all that much.)
Okay, so for 100 elements, Bubble needs 10,000 steps, Quicksort needs about 200. Big whoop, with computers capable of doing umpteen million operations per second, who cares, right?
Well, consider what happens as the number of items, n, goes up.
With 1000 elements, BubbleSort (BS) needs some 1,000,000 steps, Quicksort (QS) needs 3000.
With 10,000 elements, BS needs 100,000,000 steps, QS needs 40,000.
With 1,000,000 elements, BS needs 1,000,000,000,000 steps, QS needs 6,000,000.
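The numbers above can be reproduced in a couple of lines (again using base-10 logs, as in the text):

```python
import math

for n in (1_000, 10_000, 1_000_000):
    bubble = n * n                     # O(n^2) step estimate
    quick = round(n * math.log10(n))   # O(n log n) step estimate
    print(f"{n:>9,} elements: bubble {bubble:>17,}  quick {quick:>11,}")
```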
With a million elements, BubbleSort needs 166,667 times more operations to achieve the same goal. If you can perform a million operations per second, sorting a million elements with BubbleSort would require a million seconds (about eleven and a half days); doing the same job with QuickSort takes 6 seconds.
There are, of course, a few caveats. An "operation" or "step" may take longer to perform when doing a QuickSort than when doing a BubbleSort, for example. However, unless each step takes more than 167,000 times longer, QuickSort wins, and in reality it wins by a huge margin, a margin that gets bigger and bigger as you sort more elements.
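To make the caveat concrete, here is a minimal sketch of both sorts instrumented to count comparisons, one stand-in for "steps" (real costs also include swaps, copies, and recursion overhead; this is an illustration, not the course's implementation):

```python
def bubble_sort(items):
    """Bubble sort (no early exit); returns (sorted list, comparison count)."""
    a = list(items)
    comparisons = 0
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

def quicksort(items):
    """Simple (non-in-place) quicksort; returns (sorted list, comparison count)."""
    comparisons = 0

    def qs(a):
        nonlocal comparisons
        if len(a) <= 1:
            return a
        pivot = a[len(a) // 2]
        # Count one logical comparison per element against the pivot
        # (the three passes below each compare, but we count each element once).
        comparisons += len(a)
        less = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        greater = [x for x in a if x > pivot]
        return qs(less) + equal + qs(greater)

    return qs(list(items)), comparisons
```

Sorting 100 shuffled integers, this bubble sort always makes n(n-1)/2 = 4,950 comparisons, while the quicksort typically lands in the few hundreds, which mirrors the step estimates above.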
By understanding algorithms, we can not only do the job, but also determine whether a given algorithm is an effective way to do it.
You can extend this further, of course. For example, merge sort averages slower than quicksort as a rule, but there are cases where quicksort's behaviour can degrade to that of bubble sort, so you might want to consider merge sort. And, of course, if you're sorting things on disk, where it is very slow to move data around needlessly, there are sorts which specifically avoid moving anything that is already in its proper place.
And, of course, some sorts require more memory than others, some are better suited to certain types of data, and so on.
The proper choice of even just a sorting algorithm, when applied to a large data set such as a significant database, must take into account both the performance characteristics of the algorithm and where and how the data is accessed. It can be the difference between code that finishes overnight or over a weekend and code that would take, potentially, several million years.
This is why algorithms are important; they let us do things, and by understanding how they work and their limits and benefits, they let us do things as cheaply and effectively and efficiently as possible.
Let's get started with the Algorithms course!

