What are some reasons you might choose an algorithm with a worse runtime? And what are some advantages of a linked list over a hash table or array, where access is constant time?
By far the most common reason is that the "worse" algorithm is much simpler, or is the first solution I think of, and mathematically optimal performance just doesn't matter in the part of the code I'm currently working on. Which is most parts of the code.

Another common reason is that theoretically faster algorithms often require making stronger assumptions about the input. Radix sort is way faster than quicksort/mergesort/heapsort/etc., but it only works if you can assume minimum and maximum values for your input numbers. I don't want to hardcode arbitrary minimums and maximums into my program without a very good reason, since that's exactly the sort of thing that makes programs brittle and unpredictable in the long run. So most of the time I'm not going to use radix sort unless the business requirements happen to give me a minimum and a maximum for free.
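To make the trade-off concrete, here's a minimal LSD radix sort sketch in Python (the function name and base choice are illustrative). Note the assumption baked into it: every input must be a non-negative integer, which is exactly the kind of extra constraint the comparison sorts don't need.

```python
def radix_sort(nums, base=10):
    """LSD radix sort sketch.

    Assumes all inputs are non-negative integers; negative numbers
    would be bucketed incorrectly, so the assumption matters.
    """
    if not nums:
        return nums
    max_val = max(nums)
    exp = 1
    while max_val // exp > 0:
        # Stable counting sort on the digit at position `exp`
        buckets = [[] for _ in range(base)]
        for n in nums:
            buckets[(n // exp) % base].append(n)
        nums = [n for bucket in buckets for n in bucket]
        exp *= base
    return nums

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
# → [2, 24, 45, 66, 75, 90, 170, 802]
```

Each pass is linear in the input size, and the number of passes depends on the number of digits in the largest key, which is where the min/max assumption sneaks in.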
The biggest advantage of a linked list over an array or hash table is that inserting or removing an element anywhere in the list is a constant-time operation, even in the non-amortized worst case (as long as you already have a reference to the node you want to insert at or remove). For arrays, insertion and removal are linear-time operations. For hash tables, insertion and removal are amortized constant time, but linear time in the worst case.

In some cases linked lists (and trees) are also useful because pointers to their nodes do not get invalidated by future insertions; they only become invalid when that particular node is deleted. With arrays and hash tables, too many insertions will eventually force a reallocation, which invalidates all pointers to existing elements. In many languages this is a complete non-issue, but in something like C or C++ it can occasionally be very important.
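The constant-time insertion claim can be sketched in a few lines of Python (the `Node` and `insert_after` names are illustrative, not from any standard library). Given a reference to an existing node, splicing in a new one touches a fixed number of pointers, no matter how long the list is:

```python
class Node:
    """A node in a doubly linked list."""
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

def insert_after(node, value):
    """O(1) insertion given a reference to an existing node:
    only a fixed number of pointer updates, no shifting or rehashing."""
    new = Node(value)
    new.prev = node
    new.next = node.next
    if node.next:
        node.next.prev = new
    node.next = new
    return new

# Build the list a -> c, then splice b in between in constant time.
head = Node("a")
insert_after(head, "c")
insert_after(head, "b")  # list is now a -> b -> c
```

Contrast this with a Python `list`, where inserting at the front shifts every element, or a `dict`, where growth can trigger a rehash of the whole table.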