Author Topic: Fast market problem (4 messages, Page 1 of 1)

mac
-Interested User-
Posts: 8
Joined: Apr 6, 2017


Posted: Jun 28, 2017 09:44 AM          Msg. 1 of 4
I am relatively new to IQFeed and to socket programming in general. I'm coding in .NET, and I'm basically doing the same socket reads as the example code, Level1Socket. I may have been naive to think I could keep up with the socket.

I haven't been monitoring the Admin socket, so I don't know how often I have queued data. I happened to be trading in a fast market at 08:17-08:18 ET on June 28 when there was significant futures movement. I only watch 5 symbols: the Globex Euro, Japanese Yen, and Australian Dollar currency futures, the E-mini Nasdaq, and AAPL. The Euro had a very fast move at this time. I log Level 1 update (bid/ask) latency during a trade, and I was seeing 25-35 SECOND latencies.

After seeing this, I opened the Diagnostic program for market open at 9:30 ET to see if it reported any queued data. It showed none.

My machine clock is managed by a Windows port of the ntp.org software, so it is quite accurate (always within a few milliseconds of NTP time).

Is it possible for IQFeed to check whether there was an abnormal number of prints in this period relative to market open? I recorded around 6,000 prints in the 08:18:00-08:18:59 minute for Euro Currency Sept 2017 (about 100 per second).

Unfortunately, having never seen this before, I didn't spend much effort optimizing my software. Every print, regardless of the product, fires an event with 5 subscribing class instances, one subscription per product. In the event handler, I check whether the event args match the product I'm interested in. Maybe a better approach would be to open 5 different sockets, one per product. This could expand to 15 or 20 products in the future, but not hundreds. That way I could read from the back of the socket: if a print is old, just throw it out, since I only want the most recent info. With one socket per product, I could grab the last update and discard the rest, and I could fire an event with only one subscriber (a sketch of one way to keep only the latest update is below).
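
One way to get the "only the most recent info" behavior without extra sockets is a latest-value store that the socket-read loop simply overwrites. A minimal C# sketch, assuming a hypothetical Level1Update type (not the IQFeed API; parsing of the socket line is omitted):

using System;
using System.Collections.Concurrent;

public sealed class Level1Update
{
    public string Symbol = "";
    public decimal Bid;
    public decimal Ask;
    public DateTime ReceivedUtc;
}

public sealed class LatestQuoteStore
{
    // One slot per symbol; the socket-read loop overwrites it on every print.
    private readonly ConcurrentDictionary<string, Level1Update> _latest = new();

    // Called from the socket-read loop: stale updates are simply replaced,
    // so no backlog of old prints builds up in application code.
    public void Publish(Level1Update update) => _latest[update.Symbol] = update;

    // Called from the trading logic whenever it needs the freshest quote.
    public Level1Update? TryGetLatest(string symbol) =>
        _latest.TryGetValue(symbol, out var update) ? update : null;
}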

Also unfortunately, I don't know where the bottleneck is.

Any thoughts?

MAC

mac
-Interested User-
Posts: 8
Joined: Apr 6, 2017


Posted: Jun 28, 2017 10:43 AM          Msg. 2 of 4
I should note that the numbers quoted above (100 prints per second) were filtered prints. These were only logged if the bid or ask price changed; I was not logging changes in size at the same price. The unfiltered rate is likely much higher than 100 per second. (A sketch of this price-change filter is below.)
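
A minimal C# sketch of that filter, reusing the hypothetical Level1Update type from the earlier sketch (not part of the IQFeed API):

using System.Collections.Generic;

public sealed class PriceChangeFilter
{
    private readonly Dictionary<string, (decimal Bid, decimal Ask)> _last = new();

    // Returns true only when the bid or ask price differs from the last
    // prices seen for this symbol; size-only updates are filtered out.
    public bool IsPriceChange(Level1Update u)
    {
        if (_last.TryGetValue(u.Symbol, out var prev) &&
            prev.Bid == u.Bid && prev.Ask == u.Ask)
            return false;                    // size-only update: skip it
        _last[u.Symbol] = (u.Bid, u.Ask);    // remember the new prices
        return true;                         // price moved: log this one
    }
}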

That said, I'm running a test right now that shows no latency issues with similar filtering. I'm getting fewer price changes since there isn't a fast market, but only on the order of 30-50% fewer. I'm sampling every 10 seconds and see around 30-50 bid/ask price changes per second for ECU7. I am, however, subscribed to all 5 products. Average latencies are normal, around 75 milliseconds.

It's starting to look like my app can handle the throughput.

Any reported problems at IQFeed this morning?

MAC

mac
-Interested User-
Posts: 8
Joined: Apr 6, 2017


Posted: Jun 28, 2017 11:25 AM          Msg. 3 of 4
OK, my bad. I see 61,000 prints in the 8:17 minute.

That said, should a C# program be able to handle 1000+ prints per second? Is it possible?

MAC

DTN_Tim Walter
-DTN Guru-
Posts: 1090
Joined: Apr 25, 2006


Posted: Jun 28, 2017 11:35 AM          Msg. 4 of 4
Many people watch 1,800 symbols without issue, so it is certainly doable. Don't worry, we can get this working.

Are you printing messages to a console somewhere? If so, that is the first thing that will have to go. Other common culprits are inserting into a database or writing to disk on each update; instead, build up 1,000 or so messages before doing a batch insert. Basically, anything that happens between one read of the socket and the next needs to be looked at and considered. If there is any heavy processing happening between those reads, you would want to move it to a separate thread (a rough sketch of that pattern is below).
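
A minimal sketch of that pattern in C#: the socket-read loop only enqueues raw lines, and a separate consumer task parses and batch-processes them. The ProcessBatch method, queue capacity, and batch size are placeholders, not IQFeed API calls.

using System;
using System.Collections.Generic;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public sealed class UpdatePipeline : IDisposable
{
    private readonly BlockingCollection<string> _queue = new(boundedCapacity: 100_000);
    private readonly Task _consumer;

    public UpdatePipeline()
    {
        _consumer = Task.Run(ConsumeLoop);
    }

    // Called from the socket-read loop: this is the only work done per message.
    public void Enqueue(string rawLine) => _queue.Add(rawLine);

    private void ConsumeLoop()
    {
        var batch = new List<string>(1000);
        foreach (var line in _queue.GetConsumingEnumerable())
        {
            batch.Add(line);
            if (batch.Count >= 1000)      // batch the writes instead of per-update I/O
            {
                ProcessBatch(batch);
                batch.Clear();
            }
        }
        if (batch.Count > 0) ProcessBatch(batch);
    }

    private static void ProcessBatch(List<string> batch)
    {
        // Placeholder: parse the lines, update state, bulk-insert to a database, etc.
    }

    public void Dispose()
    {
        _queue.CompleteAdding();   // let the consumer drain the queue and exit
        _consumer.Wait();
    }
}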

If you have more specific questions you can also reach me at the email listed at the bottom of this page, https://www.iqfeed.net/dev/main.cfm

Tim