You can also use tail with groupby to get the last n rows of each group:
df.sort_values('date').groupby('id').tail(1)
    id  product        date
2  220     6647  2014-10-16
8  901     4555  2014-11-01
5  826     3380  2015-05-19
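
To make this concrete, here is a minimal, self-contained sketch; the sample values and the construction of df below are assumptions for illustration, not the original data:

import pandas as pd

# Hypothetical sample data; only the columns id, product, and date matter.
df = pd.DataFrame({
    'id': [220, 220, 901, 826, 826],
    'product': [1234, 6647, 4555, 9876, 3380],
    'date': pd.to_datetime(['2014-09-01', '2014-10-16', '2014-11-01',
                            '2015-01-02', '2015-05-19']),
})

# Sort so the most recent row comes last within each id,
# then keep the last row of each group.
print(df.sort_values('date').groupby('id').tail(1))

Note that unlike groupby('id').last(), tail(1) returns whole rows from the original DataFrame, so the original index and all columns are preserved without aggregation.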