OK... parser is done. This is going to be a data monster. Roughly 24,000 data points/hr. 660,000/week. 34.3M/year.
We'll see how it goes, but I might have to purge expired datasets after a set timeframe. Is there any value in retaining them for analysis, and if so, does anyone want to suggest a reasonable retention period?
Just have to wrap up the scheduler and should have this running for Monday.
Post by qualitywte on Sept 22, 2012 17:12:14 GMT -5
Thanks for doing this. I do think the collective data/knowledge of this group is very valuable, and this will help distill/compile it. Sometimes anecdotal data can be combined into something very informative.
Can you do private distribution? It's been a while since I was a developer. Or charge $10 and enjoy some benefit.
The intent is to use this as a data source for an iOS app that we can all use. The feature set is wide open at this point. Some features will be quite obvious, but I'm hoping to push the limits with this.
I like the idea of crunching a lot of technicals (SMA/EMA/MACD/etc) on the back-end so users can set up watch conditions and get push notifications. For spreads, could have the program analyze all the option prices and find the best spreads given certain user submitted criteria.
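For the back-end indicator crunching, a minimal sketch of SMA/EMA/MACD plus a watch condition (function names and the synthetic price series below are illustrative, not from the actual app):

```python
# Minimal sketch of back-end indicator crunching for watch conditions.
# Assumptions: prices arrive as a plain Python list of closes, oldest first.

def sma(prices, n):
    """Simple moving average of the last n prices."""
    return sum(prices[-n:]) / n

def ema(prices, n):
    """Exponential moving average, seeded with an SMA of the first n prices."""
    k = 2 / (n + 1)
    e = sum(prices[:n]) / n
    for p in prices[n:]:
        e = p * k + e * (1 - k)
    return e

def macd(prices, fast=12, slow=26):
    """MACD line: EMA(fast) minus EMA(slow)."""
    return ema(prices, fast) - ema(prices, slow)

# Example watch condition: fire a push notification when the last price
# crosses above its 20-period SMA.
prices = [100 + i * 0.5 for i in range(30)]   # synthetic uptrend for demo
should_notify = prices[-1] > sma(prices, 20)
```

A real version would compute these incrementally as each quote arrives rather than rescanning the list, but the math is the same.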
As for distribution, I'm not sure of the best way to handle it. The App Store is easiest, but I don't necessarily want to support the infrastructure for a potentially large user base. I'm looking into Enterprise distribution; then it could just be distributed to members of this board. Distribution and push notifications are a little more difficult with Enterprise distribution, but I can worry about that down the road.
To start, I'll probably just look for 20 or so co-developers/beta-testers. Want people who can actively contribute technical knowledge and direction. This is going to be a long and slow development project. Once we figure there's enough utility in it, we'll work out the distribution details.
So the parser is running and, minus a few glitches, is now feeding me my boatload of data. Now what?
I thought I had ideas of how I could utilize this information, but as a relative noob, now that it comes down to it, I'm not so sure. Still looking for ideas of how to utilize this information. What kind of analysis would be useful? Until I figure out some direction for the app, I could crunch out a daily report and post it here, but what would really be of use to everyone?
I've got 1-minute stock/volume quotes (real-time) and 5-minute option chains and VXAPL (delayed 15 minutes). I was hoping for some real-time options alert functionality, but the 15-minute delay kinda kills that. I'm thinking capturing 5-minute intervals might be unnecessary... especially considering the amount of data it generates. More possibilities with the stock quote since it's at least real-time.
I originally thought I could take this information and find the best spreads. I've seen comments in the past of people saying "Why did you pick that spread when you could have had this one with y times better return and less risk". I could number-crunch every spread from 5 to 50 dollars wide for every strike and every expiration, but how would you quantify which is better with respect to risk? Obviously the returns are better the more OTM, so just calculating for best return is pointless.
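One common (if crude) normalization is return on risk: max profit divided by max loss. A sketch of the enumeration under that metric (the chain data below is made up for illustration); note it also demonstrates the problem, since the farthest-OTM spread wins on raw return on risk, so some probability weighting (e.g. delta-based) would still be needed:

```python
# Illustrative sketch: enumerate bull call spreads and rank by
# return on risk = max profit / max loss. Chain data is made up.

def bull_call_spreads(chain, min_width=5, max_width=50):
    """chain: list of (strike, mid_price) tuples for one expiration,
    sorted by strike. Yields (long_strike, short_strike, debit, return_on_risk)."""
    for i, (k_long, p_long) in enumerate(chain):
        for k_short, p_short in chain[i + 1:]:
            width = k_short - k_long
            if not (min_width <= width <= max_width):
                continue
            debit = p_long - p_short        # max loss (the debit paid)
            if debit <= 0:
                continue
            max_profit = width - debit      # value at expiry above short strike
            yield (k_long, k_short, debit, max_profit / debit)

# Toy chain: (strike, mid price)
chain = [(680, 32.0), (690, 26.5), (700, 21.8), (710, 17.6)]
best = max(bull_call_spreads(chain), key=lambda s: s[3])
# "best" here is the farthest-OTM 700/710 spread, which is exactly
# why raw return on risk alone isn't a sufficient ranking.
```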
Welcoming all ideas. If you bring it, I will write it.
One thing I was thinking of doing is just some OI plots for all expiration dates instead of just the monthly and weekly ones Travis has. You could also show how the OI changes over time (3D plot perhaps?).
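The OI-over-time idea mostly boils down to pivoting (date, strike, open interest) records into a strike-by-date grid, which is the shape any 3D surface or heatmap plotter would consume. A minimal sketch with made-up sample records:

```python
# Sketch: pivot (date, strike, open_interest) rows into a strike-by-date
# grid for a 3D/heatmap plot of OI over time. Sample records are made up.

def oi_grid(records):
    """records: iterable of (date, strike, open_interest) tuples.
    Returns (dates, strikes, grid) where grid[row][col] is the OI
    for strikes[row] on dates[col], 0 where no record exists."""
    dates = sorted({d for d, _, _ in records})
    strikes = sorted({k for _, k, _ in records})
    lookup = {(d, k): oi for d, k, oi in records}
    grid = [[lookup.get((d, k), 0) for d in dates] for k in strikes]
    return dates, strikes, grid

records = [
    ("2012-09-24", 680, 1500), ("2012-09-24", 700, 2300),
    ("2012-09-25", 680, 1650), ("2012-09-25", 700, 2100),
]
dates, strikes, grid = oi_grid(records)
```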
I'm also interested in seeing charts of option prices, or, even more interesting, prices of different strategies like spreads over time. It might also help to show a band on these charts to indicate the bid-ask spread.
Here's a sample from today capturing the regular trading session. This is just graphed from the data. I would want to figure out what kind of visualizations would be appropriate before getting into coding. The 700C graph shows bid/ask/price but the ranges are so tight it just kind of blends into one.
Still need to integrate your legacy data, then can see what we can do with OI data over time.
OK. Added Apr 2013 680/700 BCS pricing. I have to admit, I'm not sure how you chart the bid/ask spread. If I calculate 680Bid-700Bid and 680Ask-700Ask, it's all over the map... sometimes the Ask is below the Bid. Is there a proper way to represent these or is this not a useful calculation?
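For what it's worth, one standard convention: a vertical debit spread's natural bid pairs the long leg's bid with the short leg's ask, and the ask does the opposite, so the quoted market can't cross. Subtracting bid-from-bid and ask-from-ask mixes sides, which is likely why the result is all over the map. A sketch with made-up quotes:

```python
# Standard convention for quoting a vertical debit spread (e.g. the
# Apr 680/700 BCS): spread bid = long bid - short ask,
# spread ask = long ask - short bid. Because each leg's bid <= ask,
# this quote can never cross. Quote values below are made up.

def spread_quote(long_bid, long_ask, short_bid, short_ask):
    """Natural bid/ask for long one option, short another (debit spread)."""
    bid = long_bid - short_ask   # exit: sell long leg at its bid, buy short leg back at its ask
    ask = long_ask - short_bid   # entry: buy long leg at its ask, sell short leg at its bid
    return bid, ask

bid, ask = spread_quote(long_bid=31.8, long_ask=32.2,
                        short_bid=21.6, short_ask=22.0)
```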
Last Edit: Sept 25, 2012 23:37:40 GMT -5 by cambrose
Post by Tetrachloride on Sept 27, 2012 9:48:44 GMT -5
At the moment, my deeper analysis method is watching other major companies and indices: GE, the Nikkei, the Hang Seng, Walmart. Longer-term charts for those. The Hang Seng is high; the Nikkei is in its usual languish.
How far back does the data go? Depending on that, I have some ideas on back testing option strategies against actual historical data. That would be really useful for me at least.
I have TOS on TD Ameritrade. One of the newer features on TOS is OnDemand. This feature sends you back in time with a balance of $100,000 to make stock purchases, puts, calls, etc. Using this feature takes a LOT of bandwidth and warms up my MacBook. Not every OTM call or put is covered, BUT it covers EVERY stock tick by tick, by the second. The time machine goes back to 2009.