Discussion:
WIFI Synchronization
Ian Ren
2008-08-11 13:10:09 UTC
Can someone tell me, if I have a USB-9234 and a WLS-9234 connected to the same laptop computer, what the typical delay between the two DAQ systems is? Is it within 1 ms?
 
Thanks.
 
Ian 
 
 
muks
2008-08-11 16:10:09 UTC
What exactly do you mean by "delay" here? Are you using LabVIEW?
Ian Ren
2008-08-12 12:40:06 UTC
1) When I say "delay", I mean: if I tee the same signal to a USB-9234 and a WLS-9234, what is the typical offset in time between the signals captured by the two systems?
2) I am going to use LabVIEW.
What is the best way to start a data acquisition task on these two systems at the same time? There is no start trigger that can be shared between the USB and WiFi systems, is this correct?
Thanks.
Ian
preston johnson
2008-08-12 13:40:16 UTC
Both WiFi and USB communications have start-up delays which are not deterministic. The best way to ensure you can line up the data is to record a reference signal on both devices. Then you can align the time traces from both devices to this common reference. One advanced way to do this is to use the resampling functions in our order analysis tools contained in the Sound and Vibration Measurement Suite. There are other tools in LabVIEW Full which can help as well.
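
(A rough sketch of the alignment idea Preston describes, using Python and NumPy rather than the Sound and Vibration Measurement Suite tools: record the same reference signal on both devices, estimate the lag by cross-correlation, then shift one trace so the two line up. The variable names and sample rate are placeholders, not anything from the original posts.)

import numpy as np

def estimate_lag(ref_a, ref_b, fs):
    """Estimate, in seconds, how far ref_a lags behind ref_b.
    A positive result means ref_a is the delayed copy."""
    # Remove the DC offset, then take the full cross-correlation.
    xcorr = np.correlate(ref_a - ref_a.mean(), ref_b - ref_b.mean(), mode="full")
    # The index of the peak gives the sample offset between the two traces.
    lag_samples = np.argmax(xcorr) - (len(ref_b) - 1)
    return lag_samples / fs

# Hypothetical usage: ref_usb and ref_wls are the shared reference channel
# recorded by the USB-9234 and the WLS-9234 at the same sample rate fs.
# lag = estimate_lag(ref_usb, ref_wls, fs)
# lag tells how far the USB trace lags the WLS trace; shift one record by the
# equivalent number of samples before comparing the measurement channels.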
 
Ian Ren
2008-08-12 14:10:07 UTC
Preston
 
The project I am going to work on can probably accept a 1-2 ms offset.
 
Recording a reference signal on both devices is a good idea, but it uses two additional channels, and it is not easy to send a reference signal to two DAQ devices that are moving at different speeds.
 
Thanks.
 
Ian  
Chris_D
2008-08-12 21:40:19 UTC
Hey Ian,

You can actually share a trigger using the PFI line on the wireless sleeve. You could input a digital pulse on this PFI line and the PFI line on the USB DAQ to start the tasks at exactly the same time; however, this would require physically wiring between the two devices.

Really, the delay in the data traveling between the DAQ device and the computer is irrelevant, as it will not affect the acquisition. If the travel time takes too long, you will get a timeout error because of no samples in the buffer, just as you would with any other DAQ device. When the data is read back in LabVIEW, it will be displayed based on the timestamp when it was acquired, not when it arrived at the computer.

Currently, there is no way to do wireless synchronization. The closest thing would be just starting the tasks at the same time. However, there are some emerging technologies that may make wireless synchronization possible in the future. You can read more in the following article.

Incorporating Wireless Measurements with Wired Data Acquisition Systems
http://zone.ni.com/devzone/cda/tut/p/id/7137
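
(A rough sketch of the shared-trigger approach Chris describes, written with the Python nidaqmx package rather than in LabVIEW. The device names "usbDev" and "wlsDev", the PFI0 terminal, and the rates are placeholders, and whether the wireless sleeve accepts a digital edge start trigger is taken from Chris's description, not verified here.)

import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE = 51200          # Hz; a typical 9234 rate, adjust to your task
SAMPLES_PER_READ = 10240

with nidaqmx.Task() as usb_task, nidaqmx.Task() as wls_task:
    usb_task.ai_channels.add_ai_voltage_chan("usbDev/ai0")
    wls_task.ai_channels.add_ai_voltage_chan("wlsDev/ai0")

    for task, dev in ((usb_task, "usbDev"), (wls_task, "wlsDev")):
        task.timing.cfg_samp_clk_timing(
            SAMPLE_RATE,
            sample_mode=AcquisitionType.CONTINUOUS,
            samps_per_chan=SAMPLES_PER_READ,
        )
        # Each device starts acquiring on the rising edge at its PFI0 pin,
        # so the same physical pulse wired to both starts them together.
        task.triggers.start_trigger.cfg_dig_edge_start_trig(f"/{dev}/PFI0")

    # Arm both tasks; neither acquires until the trigger pulse arrives.
    usb_task.start()
    wls_task.start()

    usb_data = usb_task.read(number_of_samples_per_channel=SAMPLES_PER_READ)
    wls_data = wls_task.read(number_of_samples_per_channel=SAMPLES_PER_READ)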
Ian Ren
2008-08-13 12:40:19 UTC
Chris,

Thanks for your help. Unfortunately, for the project I am interested in, I cannot physically wire between the two devices.

How accurate is the timestamp returned from the DAQ VI? I recall it is not exactly the same as the time at which the first data point is acquired. If the difference between the timestamp returned from the DAQ VI and the actual time the first data point is acquired is less than 1-2 ms, then I can live with it. The WiFi will be the perfect solution for me.

Thanks.

Ian
Chris_D
2008-08-14 18:10:08 UTC
Ian,

To clarify, the timestamp or t0 is actually taken from the system time when the Read is first called. The dt is calculated from the sample rate. There is more information on this in the KB below.

Why Is the Waveform Timing Information Returned by NI-DAQmx Incorrect?
http://digital.ni.com/public.nsf/allkb/5D42CCB17A70A06686256DBA007C5EEA?OpenDocument

There is currently no spec for the time difference between when the Read is called and when the first sample is actually taken. This would be difficult to spec because it all depends on the environment, network traffic, distance from the router, etc. When the task is started, a message is sent to the device telling it to initiate the acquisition, and if the Read is called immediately after the Start Task is called, the time difference should be minimal.

We do, however, know that the dt will be more precise because the time between when each sample is actually taken will be determined by the internal clock.
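
(A tiny Python illustration of the timing Chris describes: t0 comes from the host system time when the Read is first called, dt comes from the configured sample rate, and the per-sample times are reconstructed as t0 + k*dt. The sample rate and values below are made up for illustration.)

from datetime import datetime, timedelta

sample_rate = 51200.0                 # Hz, assumed for illustration
dt = 1.0 / sample_rate                # spacing set by the device's internal clock

t0 = datetime.now()                   # stand-in for the host time at the first Read
sample_times = [t0 + timedelta(seconds=k * dt) for k in range(5)]

for k, t in enumerate(sample_times):
    print(f"sample {k}: {t.isoformat()}")

# t0 absorbs the non-deterministic USB/WiFi start-up delay, while the spacing
# dt between samples stays accurate to the hardware clock.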

Sorry for not being able to provide a more exact answer, but it really depends on your setup and environment.
Ian Ren
2008-08-14 19:40:06 UTC
Chris,

Thanks for the info. At the moment, I just want to get a ballpark number (is it a few ms, or a few hundred ms?). Would it be possible for NI to do a bench test using a typical laptop computer plus one USB and one WiFi 9234 module? I understand the number obtained will depend upon the actual setup and cannot be "quoted", but at least it will let me know what I should expect.

Thanks

Ian
Chris_D
2008-08-15 21:40:18 UTC
Ian,

I was able to set up a quick test to give you a ballpark figure. To perform the test I set up a single-point, on-demand read. This means that exactly when the Read is called, it sends a message to the DAQ device, acquires one sample, and returns the sample from the Read VI. I used a Tick Count on either side of the Read and took the difference to find the exact time it took to take one sample. This gives us a round-trip number, so you could divide it by 2 to get an estimated time between when the task is started and when the sample is actually taken.
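
(A sketch of this benchmark in Python with the nidaqmx package, with time.perf_counter standing in for LabVIEW's Tick Count. The device name "wlsDev" and the 100 iterations are placeholders; Chris's original test was done in LabVIEW.)

import time
import nidaqmx

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("wlsDev/ai0")

    round_trips = []
    for _ in range(100):
        start = time.perf_counter()
        task.read()                               # single on-demand sample
        round_trips.append(time.perf_counter() - start)

    avg_ms = 1000 * sum(round_trips) / len(round_trips)
    print(f"average round trip: {avg_ms:.2f} ms")
    # Half the round trip gives a rough estimate of the delay between
    # starting the task and the first sample actually being taken.
    print(f"estimated one-way delay: {avg_ms / 2:.2f} ms")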

With the device fairly close to the router and only one device connected, it took an average of around 0.2 milliseconds round trip. However, increasing the distance from the router and adding more devices, meaning more network traffic, increased the time to around 1 millisecond.

I have to reiterate one more time that there is no guarantee you will see the same performance or numbers; it all depends on your setup and environment. Hopefully this gives you some sort of rough estimate of the approximate delay.