Issue
I have this DataFrame:
ID COL1 COL2 COL3
0 ZBC174 TYPE 4.0 NaN
1 NaN ID ZBC174 NaN
2 NaN ROW1 50 NaN
3 NaN ROW2 0 NaN
4 NaN ROW3 0 NaN
5 NaN ROW4 2 NaN
6 NaN 75.00 -2.45 1.0
7 NaN 115.50 -1.73 1.0
8 NaN ROW7 0 NaN
9 NaN ROW8 4.034 NaN
10 NaN ROW9 8 NaN
11 NaN 115.4999712 -1.73 0.7
12 NaN 122.000571 -0.17 0.7
13 NaN 126.9999986 -0.118 0.9
14 NaN 129.5000362 0.466 0.9
15 NaN 134.4994643 1.757 1.0
16 NaN 140.0004388 1.994 1.0
17 NaN 145.5000039 3.339 1.0
18 NaN 148.3417866 4.034 1.0
19 NaN NaN NaN NaN
20 ZBC173 TYPE 4.0 NaN
21 NaN ID ZBC173 NaN
22 NaN ROW1 51.977 NaN
23 NaN ROW2 0 NaN
24 NaN ROW3 0 NaN
25 NaN ROW4 2 NaN
26 NaN 81.00 -4.42 1.0
27 NaN 114.00 -1.67 1.0
28 NaN ROW7 0 NaN
29 NaN ROW8 3.696 NaN
30 NaN ROW9 7 NaN
31 NaN 113.9996969 -1.67 0.7
32 NaN 121.500409 0 0.7
33 NaN 127.9995187 0.066 0.9
34 NaN 129.4998186 0.285 0.9
35 NaN 134.4992436 1.779 1.0
36 NaN 145.9999685 2.144 1.0
37 NaN 153.2586833 3.696 1.0
I expect two txt files, one per ID in the first column:
ZBC173.txt and ZBC174.txt
The expected output for one of the files (ZBC174.txt) is:
TYPE 4.0
ID ZBC174
ROW1 50
ROW2 0
ROW3 0
ROW4 2
75.000 -2.450 1.00
115.500 -1.730 1.00
ROW7 0
ROW8 4.03
ROW9 8
115.500 -1.730 0.700
122.000 -0.170 0.700
127.000 -0.120 0.900
129.500 0.470 0.900
134.500 1.760 1.000
140.000 1.990 1.000
145.500 3.340 1.000
148.340 4.030 1.000
My code:
import pandas as pd

df = pd.read_csv(r'C:\Users\mycsv.csv', sep=';')
df = df.dropna(subset=['COL2'])
df = df.groupby('ID', as_index=False).nth(0, dropna=False)
print(df)
The output from my code is:
ID COL1 COL2 COL3
0 ZBC174 TYPE 4.0 NaN
1 NaN ID ZBC174 NaN
20 ZBC173 TYPE 4.0 NaN
If my question is not clear, please let me know. I am a beginner at coding. Thank you for your understanding.
Solution
You can try grouping on a cumulative sum of the non-null `ID` values, which labels each block of rows belonging to one ID, then write each group to its own file:

folder = r'C:\output'  # <- replace with your folder path
for _, g in df.groupby(df['ID'].notna().cumsum()):
    g.iloc[:, 1:].dropna(how='all').to_csv(f"{folder}\\{g.iloc[0, 0]}.txt", index=False)
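As a self-contained sketch of how the `notna().cumsum()` trick splits the frame (the toy DataFrame below is a stand-in for the real CSV, not the asker's data):

```python
import pandas as pd

# Toy frame mimicking the question's layout: a new record starts
# wherever the ID column is non-null.
df = pd.DataFrame({
    "ID":   ["ZBC174", None, None, "ZBC173", None],
    "COL1": ["TYPE", "ID", "ROW1", "TYPE", "ID"],
    "COL2": ["4.0", "ZBC174", "50", "4.0", "ZBC173"],
})

# notna() marks the start rows; cumsum() turns those flags into group
# labels 1,1,1,2,2 so each ID's block of rows becomes one group.
groups = {g.iloc[0, 0]: g for _, g in df.groupby(df["ID"].notna().cumsum())}
print(sorted(groups))         # ['ZBC173', 'ZBC174']
print(len(groups["ZBC174"]))  # 3 rows in the first block
```

Each group keeps its rows in original order, so writing `g.iloc[:, 1:]` per group reproduces one ID's block per file.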
Answered By - anky