OscarJ
2008-08-13 11:40:22 UTC
Hello everyone! :robotsurprised: I've been trying to do some analog voltage measurements with an NI 6221 card in LabVIEW. Since the specification for this card lists four different input ranges (+/- 10, 5, 1, and 0.2 V), I was thinking the input range could be set to the smallest range required for a specific measurement to get higher accuracy. Am I right about that?

How is this accomplished on the block diagram? I've managed to generate the following via the DAQ Assistant:

[screenshot: block diagram generated by the DAQ Assistant]

Could the minimum/maximum input values be changed to achieve this? Maybe this code inside a while loop plus a case structure could repeat the measurement until the smallest possible range is used? (A rough sketch of that idea is below.)

And another thing: I've never done any NI-DAQ programming before, and I'm still pretty inexperienced with LabVIEW :robotvery-happy:, but what other NI-DAQ actions should be included before and after a measurement like this takes place in an application? Should there be any "init/clear" commands or something? What else must one think of when acquiring data like this?

Thanks a lot in advance :robotwink:
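To make the "shrink the range until the signal no longer fits" idea concrete, here is a minimal sketch written with the nidaqmx Python API instead of LabVIEW G code, since block diagrams can't be pasted as text. It assumes the setting of min/max values on the channel is what selects the hardware range (the driver coerces them to the nearest supported range); the device/channel name "Dev1/ai0" and the 90% head-room factor are illustrative assumptions, not taken from the original post.

```python
# Sketch of auto-ranging a single-point voltage read on an NI 6221.
# Assumptions: device is named "Dev1", signal is on ai0, and a reading
# below 90% of a narrower range justifies re-measuring with that range.
import nidaqmx

# Input ranges listed for the NI 6221, widest first (volts)
RANGES = [10.0, 5.0, 1.0, 0.2]


def read_with_range(limit):
    """Take one sample with the given +/- input range.

    Creating the task and letting the with-block clean it up plays the
    role of the "init/clear" steps asked about in the post (in LabVIEW
    terms: DAQmx Create Channel ... DAQmx Clear Task).
    """
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan(
            "Dev1/ai0", min_val=-limit, max_val=limit)
        return task.read()


def auto_ranged_read():
    """Re-measure with progressively narrower ranges while the reading fits."""
    value = read_with_range(RANGES[0])
    for limit in RANGES[1:]:
        if abs(value) < 0.9 * limit:   # leave some head-room before switching
            value = read_with_range(limit)
        else:
            break
    return value


if __name__ == "__main__":
    print("Measured voltage:", auto_ranged_read())
```

In a LabVIEW block diagram the equivalent would be the loop-plus-case-structure approach described above, wiring different minimum/maximum values into the DAQmx Create Virtual Channel (or DAQ Assistant) inputs on each pass.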