Author Topic: When trading is busy, CPU usage goes up to 100% (4 messages, Page 1 of 1)

milcloud
-Interested User-
Posts: 8
Joined: Oct 14, 2016


Posted: Oct 21, 2016 10:34 AM          Msg. 1 of 4
We register about 40 symbols with IQConnect, such as CL, HG, GC, SI, and so on. During day trading, when trading gets busy, the traffic from IQConnect climbs to 600 KB/s and sometimes 1000 KB/s, and CPU usage goes to 100%. When that happens, ticks sit in the buffer and our software cannot process them as quickly as new ticks arrive.

The software that receives the ticks does not do much complex work; it just parses each tick line from IQConnect and converts it to the struct we need.

My processing method (a simplified sketch is below):
1. Thread A receives each string line from IQConnect and puts the tick string into a RingBuffer.
2. Thread B parses the string line.
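
Roughly, the layout looks like this (a simplified sketch only: a BlockingCollection stands in for our RingBuffer, and the stream reading and tick struct are placeholders):

using System;
using System.Collections.Concurrent;
using System.IO;

class TickPipeline
{
    // Bounded queue so a slow parser applies back-pressure instead of growing memory without limit.
    private readonly BlockingCollection<string> _lines = new BlockingCollection<string>(100000);

    // Thread A: read lines from the IQConnect stream and enqueue them.
    public void ReceiveLoop(TextReader iqconnectStream)
    {
        string line;
        while ((line = iqconnectStream.ReadLine()) != null)
        {
            _lines.Add(line);
        }
        _lines.CompleteAdding();
    }

    // Thread B: dequeue each line and convert it to the application's tick struct.
    public void ParseLoop(Action<string> parseTick)
    {
        foreach (var line in _lines.GetConsumingEnumerable())
        {
            parseTick(line);
        }
    }
}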

Server: 2 cores, 2 GB RAM, Windows Server 2003
Language: C#

So, my questions:
1. Has anyone else run into this problem? How do you deal with it?
2. Does IQFeed provide a snapshot of ticks, such as every 250 ms or 500 ms? Then we would not need to handle so many ticks.



I have put the tick parse function in the attachment.
Edited by milcloud on Oct 21, 2016 at 10:35 AM
Edited by milcloud on Oct 21, 2016 at 10:37 AM



File Attached: parse_function.txt (downloaded 285 times)

milcloud
-Interested User-
Posts: 8
Joined: Oct 14, 2016


Posted: Oct 21, 2016 10:39 AM          Msg. 2 of 4
I have attached the server statistics.



File Attached: server statistic.png (downloaded 221 times)

DTN_Steve_S
-DTN Guru-
Posts: 2034
Joined: Nov 21, 2005


Posted: Oct 21, 2016 02:21 PM          Msg. 3 of 4
Unfortunately, we don't provide a snapshot data option at this time, and if one were implemented, the likely lower limit would be 1 second.

With that said, I took a look at the code you provided and the obvious thing that jumps out at me is the use of String.Split. This function allocates a new array and a new string object for every field on every message. When dealing with potentially thousands of messages per second (possibly tens of thousands), this is very inefficient, especially since these are temporary objects and you are immediately converting the fields to binary. To process the feed efficiently, you need to eliminate as many of these temporary objects as possible in your processing.
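
As a generic illustration only (this is not taken from your attachment), numeric fields can be converted directly from the characters of the line while scanning it in place, so no per-field strings or arrays are allocated:

// Illustrative sketch: parse one comma-delimited price field in place, advancing
// "pos" past the field and its trailing delimiter. Signs and exponents are
// omitted for brevity.
static double ParsePriceField(string line, ref int pos)
{
    double value = 0.0, fraction = 0.0, divisor = 1.0;
    bool inFraction = false;

    while (pos < line.Length && line[pos] != ',')
    {
        char c = line[pos++];
        if (c == '.')
        {
            inFraction = true;
        }
        else if (c >= '0' && c <= '9')
        {
            if (inFraction) { fraction = fraction * 10 + (c - '0'); divisor *= 10; }
            else            { value = value * 10 + (c - '0'); }
        }
    }
    if (pos < line.Length) pos++; // skip the ',' delimiter

    return value + fraction / divisor;
}

Keeping a running position index this way lets you pull the fields out in order with zero temporary objects per message.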

Also, make sure you are using the dynamic fieldsets feature of the feed to eliminate any fields that you aren't interested in processing. I can't tell from the code snippet whether you are using this or not, but make sure you are.
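
For example, a fieldset request of roughly this form (the field names shown are placeholders; check the exact command syntax and field list against the current IQFeed developer documentation for your protocol version):

// Hedged sketch: ask the Level 1 port to send only the fields you actually process.
// "level1Writer" is assumed to be a StreamWriter already attached to the Level 1
// socket; verify the command and field names in the API docs before use.
level1Writer.Write("S,SELECT UPDATE FIELDS,Most Recent Trade,Most Recent Trade Size,Bid,Ask\r\n");
level1Writer.Flush();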

milcloud
-Interested User-
Posts: 8
Joined: Oct 14, 2016


Posted: Oct 22, 2016 08:01 AM          Msg. 4 of 4
I do use dynamic fieldsets. Thanks for your suggestion; I will try to find a more efficient way to handle this problem.

Thanks.
 

 
