Can someone please point me in the right direction on some basic programming and data collection?
My skills are non-existent, but I can figure it out if I know what I'm looking for. I want to build a database that will pull data from web pages daily and store it, also download files and store them, and tie it all together so I can spit out a daily summary.
I know that in Excel I can use "Data From Web," but it doesn't work on many of the pages I use. Should I just load the whole page into Excel and have it delete the stuff I don't need, or is there a better way? Here is an example of a page where I would like to get the data in the table and download the .csv file in the link at the end of each line: http://www.patterntrapper.com/!Data_Futures.shtml
I have Access, Excel, and Visual Studio. I'm not sure which is the most efficient and easiest to set up to begin building a historical database that will become automated, without spending years learning how to do it. Thanks a bunch.
I would probably use Visual Studio and write a simple application that first retrieves all of the .csv files and then parses the data that you want.
VB.Net, for example, has a lot of built-in routines for retrieving files off the web and parsing text. The nice thing about that particular webpage you referenced in your post is that the .csv files have the same name every day. You can just create a list of their addresses and have the program cycle through, retrieve each file at a designated time of day, and then parse each one.
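To make that concrete, here's a minimal C# sketch of the cycle described above: loop over a list of .csv addresses, download each one, and split it into fields. The URLs, file names, and field layout are placeholders, not the real ones from that site.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

class CsvFetcher
{
    static void Main()
    {
        // Hypothetical list of daily .csv addresses -- substitute the real links
        // from the page, since the files keep the same names every day.
        var urls = new List<string>
        {
            "http://www.patterntrapper.com/data/ES.csv",
            "http://www.patterntrapper.com/data/NQ.csv"
        };

        using (var client = new WebClient())
        {
            foreach (var url in urls)
            {
                // Save each file locally, named after the last path segment
                string fileName = url.Substring(url.LastIndexOf('/') + 1);
                client.DownloadFile(url, fileName);

                // Parse: split each line on commas
                foreach (var line in File.ReadAllLines(fileName))
                {
                    string[] fields = line.Split(',');
                    // ... insert fields into your database here
                }
            }
        }
    }
}
```

You'd schedule this to run at your designated time of day (Windows Task Scheduler is the simplest way) rather than keeping the program running all day.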
You indicated in your post that you want to automate the collection of this data every day. Do you simply want to create a database with this data, or do you want to run some additional analysis on it?
Thanks, and yes: to be able to run additional analysis, and to collect data from other places and have it all together in one place for analysis, I guess. Thanks again.
What kind of other data would you look to be pulling down? Is that data as user-friendly as the .csv files you referenced? Also, what kind of output do you want? Charts? Tables? Where things get a bit laborious is when you have to write individual routines to scrape information off of particular websites. Also, for example, if you wanted to retrieve historical data and then chart it -- well, there are lots of programs already available that do charting well, so no point in reinventing the wheel.
Most of the other stuff will work with Excel's basic web query. So what should I use for the actual database, Access or Visual Studio?
The data analysis will just be to get an overview of the market as a whole on one page. It doesn't have to be with charts, just perhaps visually pleasing.
Ideally, I guess, I would like to have it presented within its parameters, meaning, for example, last price in the context of pivots, moving averages, and other levels.
With Visual Studio, you can easily connect to an Access database (there's a lot of pre-written code for doing that), or you can use an SQLite database (I believe Visual Studio ships with SQLite). If you already own Microsoft Access, it may make setting up the database a bit easier.
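As a rough sketch of the Access route: .NET connects to an Access file through the OleDb classes. The file path, table name, and column names below are made-up examples, and this assumes the Microsoft Access Database Engine (ACE) provider is installed on the machine.

```csharp
using System;
using System.Data.OleDb;

class AccessWriter
{
    static void Main()
    {
        // Example connection string -- path and provider version are assumptions
        string connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\futures.accdb";

        using (var conn = new OleDbConnection(connStr))
        {
            conn.Open();
            // OleDb uses positional '?' placeholders; parameters bind in order
            var cmd = new OleDbCommand(
                "INSERT INTO DailyData (Symbol, TradeDate, LastPrice) VALUES (?, ?, ?)",
                conn);
            cmd.Parameters.AddWithValue("?", "ES");
            cmd.Parameters.AddWithValue("?", DateTime.Today);
            cmd.Parameters.AddWithValue("?", 4500.25);
            cmd.ExecuteNonQuery();
        }
    }
}
```

The parser from earlier in the thread would call something like this once per row of each .csv file.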
What I would recommend is writing out a list of all of the statistics you want to generate, and for which markets. You could also do a mock-up of how you want it to look. This information in turn will drive (a) where you get the data, (b) how much data you need to save and (c) what kind of formulas you need to be able to write.
There are so many languages and programs, and I just want to learn it and get it done with some flexibility. Thanks for pointing me in the right direction, much appreciated.
I agree with furytrader. Go for Visual Studio, and since you are using NinjaTrader, go for C#, to be precise.
For this job, google HttpWebRequest and Regex. That will enable you to scrape the data to .csv files. If you want something in Excel, then do explore Gummy Gone; my own tryst with programming began with an Excel macro from there.
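A bare-bones sketch of the HttpWebRequest + Regex combination suggested above: fetch the page's HTML and pull out the contents of the table cells. The regex pattern here is illustrative; a crude `<td>` match works on simple static pages, but a real HTML parser is more robust.

```csharp
using System;
using System.IO;
using System.Net;
using System.Text.RegularExpressions;

class PageScraper
{
    static void Main()
    {
        // Fetch the page HTML
        var request = (HttpWebRequest)WebRequest.Create(
            "http://www.patterntrapper.com/!Data_Futures.shtml");

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string html = reader.ReadToEnd();

            // Grab the contents of each <td> cell (illustrative pattern only)
            foreach (Match m in Regex.Matches(html,
                @"<td[^>]*>(.*?)</td>", RegexOptions.Singleline))
            {
                Console.WriteLine(m.Groups[1].Value.Trim());
            }
        }
    }
}
```

From there, writing the matched values out with commas between them gives you the .csv files the earlier posts talked about.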