Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others

0 votes
138 views
in Technique by (71.8m points)

python - Error looping script over a directory with exactly the same file type

Hi all, this is the original script:

import pandas as pd
import numpy as np
file_name = "text.txt"
headers = [*pd.read_csv("C:/text.txt", sep='', nrows=1)]
df = pd.read_csv("C:/text.txt", sep='', usecols=[c for c in headers if c != 'filenotes'], low_memory=False)
df2 = df.iloc[:, np.r_[7:207]]
a = df2.sum(0).tolist()
df['Total Count per Bin'] = a + ((len(df) - len(a)) * [np.nan])
df.to_csv("C:/Users/test/{}.txt".format(file_name.split('B')[0]), columns=['Bins', 'Total Count per Bin'], sep='', index=False)

As the code above shows, the file has 200 count columns. The total count of each column was transposed into a column of its own, aligned with the Bins column (which has 200 bins), and then only the binned data was exported:

 Bins Total Count
    3  250
    6  560
    9  100

I have one file for each day of the year, all in exactly the same format. (I eventually want to concatenate these files by date after finding the min, median, and max, but all of that depends on first figuring out how to run this script over a directory.)
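A minimal sketch of that sum-and-pad step on a toy frame (the column names b1..b3 are invented here; the real files have 200 count columns):

```python
import numpy as np
import pandas as pd

# Toy stand-in for one daily file: a Bins column plus three count
# columns (the real files have 200; names b1..b3 are made up).
df = pd.DataFrame({
    "Bins": [3, 6, 9, 12],
    "b1": [100, 100, 50, 0],
    "b2": [200, 300, 60, 0],
    "b3": [40, 30, 30, 0],
})

counts = df.iloc[:, 1:4]      # just the count columns
a = counts.sum(0).tolist()    # one total per column: [250, 560, 100]
# Pad the list with NaN until it matches the row count, then assign it
df["Total Count per Bin"] = a + ((len(df) - len(a)) * [np.nan])
print(df[["Bins", "Total Count per Bin"]])
```

The padding is needed because a list assigned to a DataFrame column must be exactly as long as the frame has rows.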

Now onto the code issue:

import pandas as pd
import numpy as np
import os

os.chdir("C:/")

for file_name in os.listdir( "C:/"):
    if file_name.endswith(".txt"):

        headers = [*pd.read_csv( "C:/",sep='', nrows=1)]
        df = pd.read_csv("C:/",sep='', usecols=[c for c in headers if c != 'filenotes'], low_memory=False)
        df2 = df.iloc[:, np.r_[7:207]]
        a = df2.sum(0).tolist()
        df['Total Count per Bin'] = a + ((len(df) - len(a)) * [np.nan])
        df.to_csv("C:/binned/{}.txt".format(
            file_name.split('B')[0]),
                  columns=['Bins', 'Total Count per Bin'], sep='', index=False)
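The read_csv calls in this loop receive only the bare directory string "C:/", never the file name from the loop variable, which is what the error reported further down complains about. A minimal sketch of the path-building fix, using a throwaway temp directory in place of C:/ so it runs anywhere:

```python
import os
import tempfile

# Throwaway directory standing in for "C:/" so the sketch is runnable.
directory = tempfile.mkdtemp()
for name in ("day001.txt", "day002.txt", "notes.csv"):
    open(os.path.join(directory, name), "w").close()

txt_paths = []
for file_name in os.listdir(directory):
    if file_name.endswith(".txt"):
        # Build the full path; passing the bare `directory` string to
        # pd.read_csv is what raises PermissionError: [Errno 13].
        txt_paths.append(os.path.join(directory, file_name))

print(sorted(os.path.basename(p) for p in txt_paths))
# → ['day001.txt', 'day002.txt']
```

os.listdir yields bare file names, so every consumer of them has to join the directory back on; os.path.join handles the separator regardless of whether `directory` ends with a slash.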

Edit:

import pandas as pd
import numpy as np
import os

directory = 'C:/'

for filename in os.listdir(directory):
    if filename.endswith(".txt"):
        headers = [*pd.read_csv(directory + '/' + filename, sep='', nrows=1)]
        df = pd.read_csv(directory + '/' + filename, sep='', usecols=[c for c in headers if c != 'filenotes'],
                         low_memory=False)
        df2 = df.iloc[:, np.r_[7:207]]
        a = df2.sum(0).tolist()
        df['Total Count per Bin'] = a + ((len(df) - len(a)) * [np.nan])
        df.to_csv("C:/binned/{}.txt".format(
            filename.split('B')[0]),
            columns=['Bins', 'Total Count per Bin'], sep='', index=False)

This third example above seems to be working; I'm currently trying it over a full directory as I type.

The attached image is the error message, which ends with: PermissionError: [Errno 13] Permission denied: 'C:/'

question from: https://stackoverflow.com/questions/65905562/error-looping-script-over-a-directory-with-exactly-the-same-file-type


1 Answer

0 votes
by (71.8m points)

Formatting it like below works. Thank you all for trying to help!

import pandas as pd
import numpy as np
import os

directory = 'C:/'

for filename in os.listdir(directory):
    if filename.endswith(".txt"):
        headers = [*pd.read_csv(directory + '/' + filename, sep='', nrows=1)]
        df = pd.read_csv(directory + '/' + filename, sep='', usecols=[c for c in headers if c != 'filenotes'],
                         low_memory=False)
        df2 = df.iloc[:, np.r_[7:207]]
        a = df2.sum(0).tolist()
        df['Total Count per Bin'] = a + ((len(df) - len(a)) * [np.nan])
        df.to_csv("C:/binned/{}.txt".format(
            filename.split('B')[0]),
            columns=['Bins', 'Total Count per Bin'], sep='', index=False)
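For the follow-up step mentioned in the question (combining the daily outputs and taking min, median, and max per bin), one hedged sketch with toy values in the exported format; the two frames below stand in for two days' binned files:

```python
import pandas as pd

# Toy binned outputs for two days, in the format the script exports.
day1 = pd.DataFrame({"Bins": [3, 6, 9], "Total Count per Bin": [250, 560, 100]})
day2 = pd.DataFrame({"Bins": [3, 6, 9], "Total Count per Bin": [300, 500, 120]})

# Stack the days, then summarise each bin across days.
combined = pd.concat([day1, day2], ignore_index=True)
stats = combined.groupby("Bins")["Total Count per Bin"].agg(["min", "median", "max"])
print(stats)
```

In practice each daily file would be read from C:/binned/ into the `pd.concat` list; groupby then collapses the stacked rows back to one row per bin.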
