You could use the heapq module:
>>> el = [20,67,3,2.6,7,74,2.8,90.8,52.8,4,3,2,5,7]
>>> import heapq
>>> heapq.nlargest(2, el)
[90.8, 74]
And go from there…
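For instance, if the goal is the second-largest value, you can take the last element of the `nlargest(2, ...)` result. A minimal sketch (note the hedge: if the maximum occurs more than once, `nlargest` will return the duplicate, so deduplicate first if you want the second *distinct* value):

```python
import heapq

el = [20, 67, 3, 2.6, 7, 74, 2.8, 90.8, 52.8, 4, 3, 2, 5, 7]

# nlargest(2, el) returns the two biggest values in descending order,
# so the last element is the second-largest.
second_largest = heapq.nlargest(2, el)[-1]
print(second_largest)  # → 74

# Second *distinct* value, in case the maximum is duplicated:
second_distinct = heapq.nlargest(2, set(el))[-1]
print(second_distinct)  # → 74
```

`heapq.nlargest` runs in O(n log k) for the k largest items, which beats sorting the whole list when k is small.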