0

How to import a data frame from a website in Python?

Hello. I'm writing a program for my college project. There are stock market/exchange sites which allow you to download their daily data in .txt, .csv, etc. How can I import that data directly from the website using a URL or something? Are there modules for this? Sorry for my English. Thanks

5th Feb 2017, 3:43 PM
UsedC
3 Answers
+ 1
As a start, if you want to script the downloading process, read about the capabilities of standard Python modules like socket and urllib for network programming; you will quickly discover there are multiple options beyond the standard library too. If you want to work on manipulating the data, you may want to start by searching the documentation for numpy and pandas. (Making dataframes in pandas from csv files is straightforward.) Beyond that, check out the Anaconda distribution with its integrated package manager and its relationship to IPython, SciPy, Matplotlib and particularly Jupyter Notebook, which is a great tool for manipulating and visualizing pandas dataframes.
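For example, here is a minimal sketch of that workflow, assuming Python 3; the URL is only a placeholder for whatever download link the site actually provides:

import urllib.request
import pandas as pd

# placeholder URL for a daily data file; replace with the site's real download link
url = 'https://example.com/daily_quotes.csv'

# download the file to disk using only the standard library
urllib.request.urlretrieve(url, 'daily_quotes.csv')

# load the downloaded CSV into a pandas DataFrame
df = pd.read_csv('daily_quotes.csv')
print(df.head())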
5th Feb 2017, 5:39 PM
richard
+ 1
# If you have already used pandas, simply import
# files as a dataframe using .read_excel() or .read_csv(),
# passing the url directly.
# It's easy to pickle them too.
# It's easier than studying the underlying processes.
import pandas as pd

# basic outline with an Excel file
data_url_xl = 'url.you.want.data.from.xls'
my_data = pd.read_excel(data_url_xl)
my_data.head()

# store it for later as a pickled file
file_location = r'Path/to/file/filename'
my_data.to_pickle(file_location)  # to_pickle() writes the file and returns None

# retrieve the dataframe from the pickled file
get_data = pd.read_pickle(r'Path/to/stored_data')
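For the .txt/.csv downloads mentioned in the question, pd.read_csv() accepts a URL in the same way; a minimal sketch with a placeholder URL (adjust sep to match the file's delimiter):

import pandas as pd

# placeholder URL; use the real download link from the exchange's site
daily = pd.read_csv('https://example.com/daily_quotes.txt', sep=',')
daily.head()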
10th Feb 2017, 2:01 AM
richard
0
Thanks! I already have some experience with numpy, pandas and matplotlib. Now I need to study these network modules.
6th Feb 2017, 10:26 AM
UsedC