Issue
I have the following list of dictionaries
data=[
{'Time': 18057610.0, 'String_8': -1.4209e-15},
{'Time': 18057610.0, 'String_9': 2.7353e-16},
{'Time': 18057610.0, 'String_10': 1.1935e-15},
{'Time': 18057610.0, 'String_11': 1.1624},
{'Time': 18057610.0, 'String_12': -6.1692e-15},
{'Time': 18057610.0, 'String_13': 3.2218e-15},
{'Time': 18057620.4, 'String_8': 2.4377e-16},
{'Time': 18057620.4, 'String_9': -6.2809e-15},
{'Time': 18057620.4, 'String_10': 1.6456e-15},
{'Time': 18057620.4, 'String_11': 1.1651},
{'Time': 18057620.4, 'String_12': 1.7147e-15},
{'Time': 18057620.4, 'String_13': 9.8872e-16},
{'Time': 18057631.1, 'String_8': 4.1124e-15},
{'Time': 18057631.1, 'String_9': 1.5598e-15},
{'Time': 18057631.1, 'String_10': -2.325e-16},
{'Time': 18057631.1, 'String_11': 1.1638},
{'Time': 18057631.1, 'String_12': -3.9983e-15},
{'Time': 18057631.1, 'String_13': 4.459e-16}]
From this I want to get the following dataframe:
df=
                  String_8     String_9 ...    String_12    String_13
Time ...
1.80576100e+07 -1.4209e-15 2.7353e-16 ... -6.1692e-15 3.2218e-15
1.80576204e+07 2.4377e-16 -6.2809e-15 ... 1.7147e-15 9.8872e-16
1.80576311e+07 4.1124e-15 1.5598e-15 ... -3.9983e-15 4.4590e-16
Below is the code I have tried, but it collects every value of the 'Time' key into one list, so I can't pass the result to pd.DataFrame(dd):
from collections import defaultdict

dd = defaultdict(list)
for d in data:
    for k, v in d.items():
        dd[k].append(v)
I also tried a = dict(ChainMap(*data)) with no luck. Thanks.
Solution
You can try itertools.groupby:
import pandas as pd
from itertools import groupby
from collections import ChainMap

# Group consecutive records that share the same "Time", then merge
# each group's dicts into a single row with ChainMap.
df = pd.DataFrame([ChainMap(*g) for _, g in groupby(data, lambda k: k["Time"])])
print(df.set_index("Time"))
Prints:
String_10 String_11 String_12 String_13 String_8 String_9
Time
18057610.0 1.193500e-15 1.1624 -6.169200e-15 3.221800e-15 -1.420900e-15 2.735300e-16
18057620.4 1.645600e-15 1.1651 1.714700e-15 9.887200e-16 2.437700e-16 -6.280900e-15
18057631.1 -2.325000e-16 1.1638 -3.998300e-15 4.459000e-16 4.112400e-15 1.559800e-15
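Note that itertools.groupby only merges consecutive records with equal 'Time' values, so the input must already be ordered by time (as it is here); otherwise, sort it first. As a pandas-only alternative (a sketch, not part of the original answer), you can build the frame directly and take the first non-null value per column within each Time group:

```python
import pandas as pd

# A shortened sample of the original data, for illustration.
data = [
    {'Time': 18057610.0, 'String_8': -1.4209e-15},
    {'Time': 18057610.0, 'String_11': 1.1624},
    {'Time': 18057620.4, 'String_8': 2.4377e-16},
    {'Time': 18057620.4, 'String_11': 1.1651},
]

# Each row of pd.DataFrame(data) holds one key/value pair and NaN
# elsewhere; groupby("Time").first() keeps the first non-null value
# per column within each group, collapsing the rows into one per Time.
df = pd.DataFrame(data).groupby("Time").first()
print(df)
```

This version does not require the records to be pre-sorted, since pandas groups all rows with the same key regardless of position.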
Answered By - Andrej Kesely