Author Topic: Level 1 Socket Java App (Fundamental Data Question) (1 messages, Page 1 of 1)

codingcorners
-Interested User-
Posts: 1
Joined: Aug 5, 2016


Posted: Aug 5, 2016 09:03 AM          Msg. 1 of 1
Hi all, I'm building a Java app for a client based on the same idea as the sample Level1Socket.java app in the developer examples. The gist of it is a command-line socket listener (no GUI) that:

1. Sets the fieldset to 'Symbol', 'Last', 'Total Volume', 'High', 'Low', 'Number of Trades Today', and 'VWAP'.

2. Loads a CSV 'symbol loader' file with one symbol per line (about 600 right now, but closer to 6,000 at expected capacity) and runs the 'w[SYMBOL]' Watch command for each of them.

3. Stores the returned summary/update lines (either P or Q) in an ArrayList, ignoring all T, F, and n lines.

4. Has a separate Timer thread loop through this ArrayList every three seconds and save the data to a MySQL database, updating if the symbol already has a row or inserting if not. (A rough sketch of this flow is below.)
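
Since the list above is a bit abstract, here's a rough, untested sketch of that flow in Java. The Watch command and the field names are from step 1/2 above; the S,SELECT UPDATE FIELDS command and port 5009 are my reading of the Level 1 protocol docs, and symbols.csv, the JDBC URL, and the level1 table are placeholders I made up:

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.net.Socket;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;
import java.util.Map;
import java.util.Timer;
import java.util.TimerTask;
import java.util.concurrent.ConcurrentHashMap;

public class Level1Watcher {
    public static void main(String[] args) throws Exception {
        // Latest P/Q line per symbol; overwritten as new updates arrive.
        Map<String, String[]> latest = new ConcurrentHashMap<>();

        Socket sock = new Socket("127.0.0.1", 5009); // Level 1 port (assumed default)
        BufferedWriter out = new BufferedWriter(new OutputStreamWriter(sock.getOutputStream()));
        BufferedReader in = new BufferedReader(new InputStreamReader(sock.getInputStream()));

        // Step 1: restrict update/summary messages to the fields we care about.
        out.write("S,SELECT UPDATE FIELDS,Symbol,Last,Total Volume,High,Low,Number of Trades Today,VWAP\r\n");

        // Step 2: watch every symbol in the loader file (one symbol per line).
        List<String> symbols = Files.readAllLines(Paths.get("symbols.csv"));
        for (String sym : symbols) {
            out.write("w" + sym.trim() + "\r\n");
        }
        out.flush();

        // Step 4: every three seconds, upsert the latest snapshot for each symbol.
        Connection db = DriverManager.getConnection("jdbc:mysql://localhost/quotes", "user", "pass");
        new Timer(true).scheduleAtFixedRate(new TimerTask() {
            public void run() {
                String sql = "REPLACE INTO level1 (symbol, last, total_volume, high, low, trades, vwap)"
                           + " VALUES (?,?,?,?,?,?,?)";
                try (PreparedStatement ps = db.prepareStatement(sql)) {
                    for (String[] f : latest.values()) {
                        for (int i = 0; i < 7; i++) ps.setString(i + 1, f[i + 1]);
                        ps.addBatch();
                    }
                    ps.executeBatch();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }, 3000, 3000);

        // Step 3: keep only P (summary) and Q (update) lines; ignore T, F and n lines.
        String line;
        while ((line = in.readLine()) != null) {
            if (line.startsWith("P,") || line.startsWith("Q,")) {
                String[] fields = line.split(",");
                if (fields.length >= 8) {
                    latest.put(fields[1], fields); // fields[1] is the Symbol
                }
            }
        }
    }
}

One difference from step 3: this keeps only the latest line per symbol in a map rather than appending to an ArrayList, just to keep the three-second upsert simple.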


My question: there's a lot of data coming through the feed that I don't necessarily need, the 'F' Fundamental lines in particular. Is there any way to turn those off, like there is for Timestamp and News lines?
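
(Continuing the sketch above, with out being the same socket writer: the Timestamp/News toggles I mean are, if I'm reading the Level 1 protocol docs right, plain S-commands sent once after connecting.)

out.write("S,TIMESTAMPSOFF\r\n"); // stop the periodic T timestamp lines
out.write("S,NEWSOFF\r\n");       // stop n news headline lines
out.flush();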

Another question: is this the best way to get the most current data from IQFeed in real time? I was hoping there might be a way to batch-poll a set of symbols for a specific fieldset and re-poll every 2 seconds to check for changes, rather than waiting on the variability of a socket session to emit data (which is producing some inconsistent results when we go to save data to our DB).

I'm finding that I can't save enough good data from the feed to do accurate processing into my database table. My guess is that the reader is handling so much other data that it can't keep up with what's coming down the pipeline, so I'm trying to free it up and save some bandwidth.

Any help would be greatly appreciated, I've been banging my head against the wall all night on this! :)
 

 
