Discussion:
Momentary switches and sequential logic--- seemingly not available in LabVIEW
Moose Man
20 years ago
I'm a new user of LabVIEW and yesterday discovered something fairly interesting: there are no "momentary action" buttons or switches. (Nor, for that matter, does there appear to be any concept of sequential logic, i.e., D, T, or JK flip flops.) <br> Why would I want a momentary action switch? Well, I started writing a VI to prompt the user for input (ultimately to be a file name in which input setup parameters are to be stored). Without a momentary action switch, the simple VI I wrote keeps prompting for input until that switch is manually turned off with another mouse click. But I'm not able to enter that mouse click, because I am continually being asked for input. Clearly, a momentary action switch would solve that problem. Alternatively, a "pulse generating" or "triggering" function would half solve the problem (I'd still have to manually turn the switch back off, which wouldn't be all that great), but that's not available either.<br> My tentative conclusion is that you just can't do this sort of thing in LabVIEW--- at the very least, that there is no concept of "sequential logic," or synchronous logic more generally, in LabVIEW. I know that I could, at least in theory, make my own "flip flops" out of NAND gates. But of course, there still would be no "system clock." Perhaps what I need to do is create my own digital one-shot multivibrator using LabVIEW NAND gates. Fine, but why can't LabVIEW simply provide a "momentary action" switch? One that would be of sufficient "duration" to "prompt for user input" once, after the virtual panel switch was hit, and only once.
Gilberto Campos
20 years ago
Post by Moose Man
I'm a new user of LabVIEW and yesterday discovered something fairly interesting: there are no "momentary action" buttons or switches. (Nor, for that matter, does there appear to be any concept of sequential logic, i.e., D, T, or JK flip flops.)
... cut ...



In LabVIEW, the mouse right button is the path to (almost) all
solutions!
After placing your button on the front panel, right-click on it and
select "Mechanical Action" to choose your 'momentary action' mode.

In general, right-clicking on a front panel object or a diagram function
will display a menu with everything there is about that
object/function.


Another almost infinite source is the examples that ship with LabVIEW.
Click on Help\Find Examples and, on the 'Search' tab of the examples
window, type "mechanical"; you'll find "Mechanical Action of
Booleans.vi", which explains the behaviour of the six working
modes of the boolean buttons.
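For readers coming from text languages, the latch/switch distinction can be loosely sketched in Python. This is an illustrative toy model, not NI code: the class, method names, and mode strings are invented for the example; the key idea is that a "latch" mode resets itself after the diagram reads it once, which is exactly what a momentary action needs.

```python
class BooleanControl:
    """Toy model of a LabVIEW front-panel boolean (illustrative only)."""

    def __init__(self, mode):
        self.mode = mode        # e.g. "switch_when_pressed", "latch_when_released"
        self.value = False      # the control's default state

    def press(self):
        """Simulate the user clicking the button on the front panel."""
        self.value = not self.value

    def read(self):
        """Simulate the block diagram reading the terminal once per loop."""
        current = self.value
        if self.mode.startswith("latch") and current:
            self.value = False  # latch modes snap back after ONE read
        return current


btn = BooleanControl("latch_when_released")
btn.press()
print(btn.read())   # True: the loop sees the click exactly once
print(btn.read())   # False: the button has already reset itself

sw = BooleanControl("switch_when_pressed")
sw.press()
print(sw.read())    # True: switch modes stay on until clicked again
print(sw.read())    # True
```

With a latch-mode button, the polling loop that prompts for input fires once per click instead of repeatedly, which is the behaviour the original post was after.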


If none of these help, a quick tour of the NI Developer Zone may bring
further insight. From your preferred browser surf to http://ni.com, type
your search string into the text box in the top-right corner and select
"NI Developer Zone" in the drop-down box first.


Of course, writing on the NG is also valid - that's why I'm answering -
it's just a bit slower to get an answer!


Have fun!

G.
--
Posted via Mailgate.ORG Server - http://www.Mailgate.ORG
Ed Dickens
20 years ago
Dennis and John already hit on the solution for your problem by using the Latch action for booleans, but I just wanted to comment on your remark about a function not being available.<br><br>[climbing on soapbox]<br><br>You're correct that there are no D, T, or JK flip flop functions built into LabVIEW, but why should there be? NI provides all the building blocks you need to make one. If NI attempted to provide a function for everything you can think of, LabVIEW would be bloated with thousands of functions that you would probably never use. There are probably already a thousand functions/subVIs that come with LabVIEW, and I know I've most likely only used a fraction of them.<br><br>LabVIEW is a programming language. If you need something that doesn't exist, build it and save it to your user.lib directory so you can use it again later. Look at it this way: if you needed to add two numbers, then multiply the sum by 100, would you expect a function that did that, or would you just drop an "Add" and a "Multiply" function and connect them together?<br><br>I think part of the problem here is NI Marketing. They tout LabVIEW as so easy to use, anybody can do it. And for the most part, that's true. But then people start thinking that everything they need will be there in a convenient icon they'll just have to drop and connect. And to some extent, that's true too, even if you have to drop a few and connect them.<br><br>[/climbing off soapbox]<br><br>Since you're new to LabVIEW, you should start off by running through the tutorial. Open the Help menu and select "Search the LabVIEW Bookshelf". This will open a PDF that is actually a searchable index of all the help available in LabVIEW. On the page that opens, you'll see a section "New Users" and a link "Getting Started with LabVIEW". This is a pretty good tutorial and will cover all the basics you need to know, like the different actions of the Boolean controls.
There's also a really good tutorial online at <a href="http://cnx.rice.edu/content/col10241/latest/" target=_blank>http://cnx.rice.edu/content/col10241/latest/</a>. <br><br>Also, attached are a couple of VIs I found somewhere a while ago. One is a JK flip flop, the other is a D latch.<br><br>Ed


Boolean_Arch.zip:
http://forums.ni.com/attachments/ni/170/122383/1/Boolean_Arch.zip
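For anyone following the thread without LabVIEW installed, the behaviour of a clocked JK flip-flop like the one in the attached zip can be sketched in a few lines of Python. This is a software model of the standard JK truth table, not taken from the attachment itself; the class name is invented for illustration.

```python
class JKFlipFlop:
    """Software sketch of a clocked JK flip-flop (standard truth table)."""

    def __init__(self):
        self.q = False                 # output state Q, initially low

    def clock(self, j, k):
        """Apply one clock edge with inputs J and K; return the new Q."""
        if j and k:
            self.q = not self.q        # J=1, K=1: toggle
        elif j:
            self.q = True              # J=1, K=0: set
        elif k:
            self.q = False             # J=0, K=1: reset
        # J=0, K=0: hold the previous state
        return self.q


ff = JKFlipFlop()
print(ff.clock(1, 0))   # True  (set)
print(ff.clock(1, 1))   # False (toggle)
print(ff.clock(0, 0))   # False (hold)
```

The same four cases (hold, set, reset, toggle) are what the NAND-gate construction discussed earlier in the thread implements in hardware.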
Moose Man
20 years ago
The first two messages were quite helpful. The third one is too, but in a very different sort of way. It's helpful in helping me to figure out why I'm having so much trouble with LabVIEW. That is to say, I've done a number of tutorials with LabVIEW. They're great except for one thing--- they really don't teach you how to do anything *new* or *innovative* in LabVIEW. That is to say, they really don't tell you much of anything about the top-level architecture of LabVIEW--- they don't teach you how to "think about engineering problems" in a LabVIEW environment. Instead, they hope to give you enough "prefab" solutions to enough problems so that you'll be able to kludge together solutions to new problems. <br> This is a very old approach, dating back to antiquity. (The great Roman educator Quintilian (ca. AD 35-100) in his "Education of an Orator" complained about a new breed of orator that was taking the "canned solution" approach to arguing points, i.e., rote memorization of specific arguments. What Ronald Reagan did at the end of his last presidential debate, where he started reciting movie script lines he had memorized decades prior, was an example of this approach. Ronnie did this not because he was getting senile, as some have supposed, but because he never was a deep thinker, though he always did have a photographic memory.) <br> In any case, I find this approach to be inefficient, stiff, and inflexible at best. What is worse, it makes LabVIEW appear to be (rightly or wrongly) extremely counterintuitive in a number of ways. The matter of the momentary switch is but one example. Sure, I'm glad to know how to do it, and I thank the two people who explained this to me, but please don't tell me this could have been either "figured out" or learned by way of any of the standard tutorials. <br> I also have Bishop's book, LabVIEW 7 Express. Nice book if you're teaching a course in LabVIEW. Also a good book if you want to get a good grade in a course on LabVIEW.
But for coming up to speed quickly in practical use of LabVIEW, or as a reference guide--- worthless. I really wish this were not the case, but sadly, it is. I can also tell you that if I were to write such a book, it would be organized in a very different sort of way. The first chapter, for example, would discuss data representational forms in LabVIEW (binary trees, flat strings, and so forth; I don't see how anyone could begin to understand LabVIEW without knowing that). And the first example I'd give wouldn't be anything like any of Bishop's. That is to say, I wouldn't just throw an example at someone and say "see, here is how it's done." Why not? Because such an approach doesn't really show you how to do anything. Instead, I would start with some simple application anyone can relate to--- like a VI that converted Fahrenheit to Centigrade. I would start by saying something like this: "To solve this problem, first you have to give the user a means of entering a temperature to be converted, and a way of reading the result. You do this in the Front Panel mode by... etc. Then, you have to create a function that actually does the conversion. That you do in Block Diagram mode by... Now, suppose you want to save the result in a file? That you do by..." In essence, this first example would be paradigmatic of the LabVIEW design process--- one which more or less would be a simplified model for all designs. <br> As to my point about not having any sort of flip flops, I just can't begin to buy what you say. Anyone who has done any sort of logic design at all (even crude logic design with electro-mechanical timer relays--- the kind that used to plug into octal tube sockets) knows that you need more than just combinational logic gates to do anything practical. Sure, I could pull out my old logic design textbook from 30 years ago, and refresh my memory of how to build a JK Flip f
John Rich
20 years ago
I think that you missed the point of Ed's post. LabVIEW is a very powerful programming language with many built-in functions. Unfortunately, it would be impossible to have functions for everything that every user could ever want. However, since it is a programming language, the building blocks are there. Furthermore, there are often solutions available if you look or ask for them. For instance, Ed attached a JK Flip-Flop VI to his post. <br><br>I may have missed your point on the tutorials. It appears that you're saying that they don't teach you how to do anything new and innovative (JK Flip-Flops?), but then you say that a tutorial on how to convert Fahrenheit to Centigrade is how you would begin. I have found that the tutorials are very good for learning the basics of LabVIEW. To go beyond the basics, you could take LabVIEW courses. Also, much can be learned by just tinkering around with LabVIEW and reading questions and replies on this forum. There are many of us who are self-taught LabVIEW programmers. Some are even involved in building large projects in LabVIEW. We didn't get there by expecting everything to be available in a canned solution. We began by learning the basics and then pushing the limits of our knowledge. When we don't know, we ask questions and usually get good solutions, often finding that someone else has already done what we're trying to do, or some variation thereof.<br><br>You asked how LabVIEW is designed. Well, its original application was in data acquisition and display. This is still a major role for LabVIEW, although new features have helped it evolve into a much more versatile language.
Ed Dickens
20 years ago
<blockquote><hr>I can also tell you that if I were to write such a book, it would be organized in a very different sort of way. The first chapter, for example, would discuss data representational forms in LabVIEW (binary trees, flat strings, and so forth; I don't see how anyone could begin to understand LabVIEW without knowing that). And the first example I'd give wouldn't be anything like any of Bishop's. Instead, I would start with some simple application anyone can relate to--- like a VI that converted Fahrenheit to Centigrade. I would start by saying something like this: "To solve this problem, first you have to give the user a means of entering a temperature to be converted, and a way of reading the result. You do this in the Front Panel mode by... etc. Then, you have to create a function that actually does the conversion. That you do in Block Diagram mode by... Now, suppose you want to save the result in a file? That you do by..." In essence, this first example would be paradigmatic of the LabVIEW design process--- one which more or less would be a simplified model for all designs.<hr></blockquote><br><br>The way you say you would write a book on the subject is pretty much how the <a href="http://sine.ni.com/nips/cds/view/p/lang/en/nid/2241" target=_blank>LabVIEW Basics 1&2</a> classes are set up. We start out by introducing the basic parts of LabVIEW (front panel, block diagram, controls and indicators, data types, etc.), then start with a simple exercise. Then, throughout the class, we build on the original exercise to eventually build an entire application that really does something.<br><br><blockquote><hr>The bigger point I was trying to make was that LabVIEW simply isn't designed for a "logic designer's" point of view. Fine. How then *is* it designed? Actually, I hoped I might get an answer to that question out of my initial post. Perhaps I still will.
Perhaps I might even figure out the answer to that question myself.<br><hr></blockquote><br><br>I think you missed my point. You're correct that LabVIEW isn't designed from a "logic designer's" point of view. But I don't think it's advertised as such, either. LabVIEW is a programming language, just as C++ or Visual Basic is. It gives you all the pieces you need to do what you need to do. Your job is to figure out which ones you need and what order to put them in. If it was designed to be a logic designer's tool, then the person using it to build a vision inspection system for an assembly line would have a really hard time. Look at it this way: if C++ were written from a "word processor" writer's point of view, a person trying to write a spreadsheet application would have a hard time. LabVIEW is no different, except in how you create your application. Instead of lines of code, you use graphic symbols.<br><br>LabVIEW is a tool that anyone can use to build an application. There's a learning curve involved for the new user, and we've all been there. When I started with LabVIEW about 5 years ago, I had the exact same problem you had with the boolean controls. How do I make a momentary switch? Like you, I figured it out by asking the question. I eventually discovered that the LabVIEW Help is really one of the better Help systems that applications ship with. It's not perfect; none of them are. But there is a wealth of information in there if you look. And if you think something is missing, posting it here will get the attention of the folks at NI that write the Help files (yes, they do monitor this forum; anybody whose name and participation bar appears in blue works at NI), and there's a good chance your suggestion will be implemented.<br><br>I hope that clears up your "How is it designed" question a bit. LabVIEW is used for many different types of applications.
Just look at the number of <a href="http://www.ni.com/labview/" target=_blank>addon packages</a> available for LabVIEW and you'll get an idea of its capabilities.<br><br>Ed
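As an aside, the Fahrenheit-to-Centigrade walkthrough quoted above maps neatly onto a text-language sketch. Python is used here purely for illustration; the function name and the output file name are made up, and the comments mark which step of the proposed tutorial each part corresponds to.

```python
def fahrenheit_to_celsius(deg_f):
    """The 'block diagram' step: the function that does the conversion."""
    return (deg_f - 32.0) * 5.0 / 9.0


# The 'front panel' step: a way to enter a temperature and read the result.
reading = 212.0                              # stand-in for a front-panel control
result = fahrenheit_to_celsius(reading)
print(f"{reading} F = {result:.1f} C")       # stand-in for an indicator

# The 'save the result in a file' step from the proposed tutorial.
with open("conversion_result.txt", "w") as fh:   # file name is illustrative
    fh.write(f"{result:.1f}\n")
```

Each comment corresponds to one of the three phases Moose Man's hypothetical first chapter would walk through: front panel, block diagram, then file I/O.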
Moose Man
20 years ago
Let me try once more to clarify. But this is going to have to be it. There have been a lot of connections drawn here between things I've written that aren't really connected (like tutorials and JK flip flops). First let me say that my intention has not been to beat up on NI or LabVIEW. NI is a good company run by some really nice folks, as far as I can tell. All of my interactions with the company have been extremely positive. My primary purpose, thus, is to learn how to use LabVIEW well enough to make some money. There are other things I could do with my time that would be a lot more enjoyable than technical work, but I can't make a living doing them. Thus, I'm stuck with trying to make the best of using tools such as LabVIEW, Matlab, and so forth. <br> Basically, my background is systems design and logic design. I've done some analog, yes, but I'm mainly digital, and I'm mainly systems. <br> To start with the digital side of things, any university-level digital logic design course is going to begin with combinational logic (Boolean expressions, Karnaugh maps, etc.). Then, once a student masters all of that and thinks he's got it made, the professor points out to him that to do anything really useful, you need the concept of a "logic state." How do you implement logic states? With memory elements--- the most basic memory element being the flip flop (SR being the most primitive, as I recall). Then, to define a total system, you decide on a number of specific "states." If, say, you were designing a coin vending machine, one state would be "machine empty," another "5 cents deposited," and so on. You would then move on to arbitrarily define specific states to represent specific conditions (where n is the number of flip flops, there are 2**n possible states of your system).<br> Anyway, if I were to design a hardwired logic (i.e., not using LabVIEW) instrumentation panel, this is basically how I would do it.
Of course, I'd be far more likely to use a microcontroller or "microprocessor" chip for most applications, but either way, I'd want to begin by defining a finite number of "states" of my system, and draw a flow diagram. If I wanted parallelism, I might use a Petri net of some sort. <br> My point: this is clearly NOT the way LabVIEW does things. My question was, therefore: if not this way, how *does* LabVIEW structure and organize things (that is, if you hope to wind up with something better than what is commonly called spaghetti code, how do you do a top-down design)? So far, nobody has even tried to answer that question. Instead, people have more or less dumped on me for asking the question--- accusing me of expecting to have every little thing done for me ahead of time. Well, the reality is just the opposite. I just want a few basic building blocks. In the world of logic design, flip flops are really quite basic. Which led me to conclude: the sequential logic design paradigm is NOT a particularly useful one for LabVIEW. No, I don't want to try to change that fact by brute force, using JK flip flops implemented with NAND gates--- I want, instead, a more useful model of LabVIEW as supplied by NI.
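The coin-machine design described above can be sketched as a tiny state machine in a text language. This is an illustrative toy under one simplifying assumption: the state is encoded directly as the number of cents deposited (0 being "machine empty"), and the item price and function name are invented for the example.

```python
PRICE = 25   # hypothetical item price in cents


def run_machine(coins):
    """Feed a sequence of coin events (5, 10, 25) through the state machine."""
    state = 0                        # the "machine empty" state
    items = 0
    for coin in coins:               # each coin event drives one transition
        state += coin                # e.g. 0 -> "5 cents deposited" -> ...
        if state >= PRICE:
            items += 1               # the "dispense" transition
            state -= PRICE           # leftover credit becomes the new state
    return items, state


print(run_machine([5, 5, 10, 5]))   # (1, 0): one item dispensed, back to empty
print(run_machine([10, 10]))        # (0, 20): waiting in the 20-cents state
```

In LabVIEW terms, this kind of design is usually built as a state-machine architecture (a while loop, a case structure, and a shift register holding the current state), rather than as flip-flops wired from gates.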
Dennis Knutson
20 years ago
The "sequential logic design paradigm" is not the paradigm that LabVIEW or NI represents at all. The paradigm presented is dataflow with graphical programming elements. Data flow is much more than data flows in and data flows out. Put simply, data flow means a function does not execute until its data is present. Unlike text-based languages, where you control the order of execution by the order of statements or gotos or function calls, execution order is determined by when data is present. This allows for inherent parallelism, in that if you don't connect functions with some sort of data, the functions will execute in parallel. LabVIEW is not so specialized that it only supports logic design. A logic design program can be created with LabVIEW just as you can create a logic design program in C++ or VB. A computer science class is much more applicable to thinking about LabVIEW than a digital design course. <br><br>It's unfortunate that you've found the book and the tutorials lacking. I can't comment on any because I've never read a beginners' book or tried any of the tutorials. I firmly believe that there is no replacing a class with a real, live instructor. Many cities have classes, so I would encourage you to try that route. If that's not possible, then try to find a local user group and, of course, make use of this forum. <br><br>Lastly, let me point out a couple of things. LabVIEW ships with a document called the LabVIEW Development Guidelines. It discusses top-down and bottom-up design approaches and includes a style guide that should be read by everyone. And, since you mention state machines, this architecture is covered in a couple of shipping examples. There is also an add-on for LabVIEW in which you can graphically draw a state machine and then generate the code.
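The dataflow rule described above can be approximated in a text language. The following is a rough Python analogy, not a description of how LabVIEW is implemented: the two "acquire" functions are invented stand-ins, and futures are used to imitate two nodes with no wire between them running in parallel, while a downstream node fires only once both of its inputs have data.

```python
from concurrent.futures import ThreadPoolExecutor


def acquire_a():
    return 2        # imagine one independent data source


def acquire_b():
    return 3        # another, unconnected data source


def add(a, b):
    return a + b    # this "node" cannot fire until both inputs are present


with ThreadPoolExecutor() as pool:
    fa = pool.submit(acquire_a)    # no wire connects these two nodes,
    fb = pool.submit(acquire_b)    # so they are free to run concurrently
    total = add(fa.result(), fb.result())   # fires only when both arrive

print(total)   # 5
```

In a text language you must orchestrate this explicitly; in LabVIEW the wiring of the diagram expresses the same dependencies, and the runtime schedules the parallelism for you.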
Moose Man
20 years ago
As far as the tutorials and the one book I have by Bishop go, I'm just saying, flat out, to put it bluntly, that they're all really terrible. As someone who has *recently* tried to pick this all up, and as someone with a certain background in education generally, I think there is some validity to what I say. <br> First of all, if you're an engineer working in the field, you simply cannot afford the luxury of a 3-month course to learn LabVIEW, nor the hours that would be required to work through Bishop's book on one's own. Secondly, it wouldn't be necessary to do that if there were a decent book available (it may well be that one is; I haven't had a chance to look at very many of them).<br> What's wrong with Bishop's book? To begin, Chapter 1, the worst in the book, is a complete mental overload. He throws a proverbial million things at you at once. Even worse, nothing is really explained in that chapter at all. As a result, the reader keeps rereading and rereading to try to figure out what he missed, and where he missed it, only to eventually realize he didn't really miss anything, because there's really nothing there at all. That is to say, the chapter tells you nothing. Everything you could possibly hope to learn by reading these 39 pages you could learn by taking a minute or two to open the LabVIEW program and simply clicking a few icons. If there is one useful page in the whole chapter, it is the bottom of page 32 and all of 33, where a few key terms are defined, like Block Diagram and Front Panel. (He mentions nodes on page 33, but doesn't really tell you what they are--- again making the reader feel frustrated for having missed something--- something which isn't really there.) <br> The chapter claims, of course, to "introduce the LabVIEW Environment." The reality is that it does nothing of the sort. In point of fact, it tells you absolutely nothing about "the LabVIEW Environment."
It does tell you a little about the LabVIEW user interface, but as any undergraduate computer science student knows, a user interface is not the same thing as a programming environment. As one web definition for "programming environment" puts it, "A programming environment implies a particular abstraction of the computer system called a programming model."<br> So, what is the programming model of LabVIEW? Is it a place/transition net? A colored Petri net? What? This is the question I have been asking for some number of days now. Bishop, in Chapter 2, page 44, mentions in passing something about "data flow programming." But what does he then tell you this means? That data flows in, and it flows out. Wow. And I paid $65 for *this*? <br> I could perhaps reasonably come to the conclusion that LabVIEW is little more than an amorphous blob of code, originally written by some analog guys, that people are just somehow able to use (mostly by memorizing little tricks). But I really don't believe that--- even though a good number of LabVIEW users, and book authors, seem to approach it that way. I do believe, however, that LabVIEW shows more signs of "evolution" than it does of "intelligent design." (Two phrases I have borrowed from a certain theological debate that happens to be raging right now--- I hope people appreciate it.) That wouldn't be so bad if there were a really good tutorial out there somewhere that could, in an hour or two, get you off and running on real-world projects. But if there is, I have yet to find it. I'm half tempted to try to write one myself, but as I said at the outset of this post, I have bills to pay. Writing such a tutorial would be but another "fun" thing I could do, but sadly wouldn't help pay my rent.
tbob
20 years ago
I have not read Bishop's book, but it appears that you have wasted your $65. I am a self-taught LabVIEW programmer, so I cannot recommend any book. As far as LabVIEW's "programming model" goes, what do you mean by programming model? I've never heard of a place/transition net or a colored Petri net. LabVIEW is a programming language, not a hardware design tool. There is no simple structure that everyone follows. You create whatever you need. It is up to the programmer. I have programmed in lots of languages, Fortran, Pascal, C, VB, Assembly, and I can say that LabVIEW is by far the easiest language to learn and to use. Not all of my programs follow the same structure. Not all of my projects start with top-down design. It all depends on what you want to do. It takes experience to determine the best way to start and the best way to proceed. Start with something simple until you understand what you can do with LabVIEW. Look at some examples that come with LabVIEW. I haven't taken it, but I hear that NI's basic course is great. If you can't attend, you can buy the course material. If it is too expensive, try another book on learning LabVIEW. I got off and running with LabVIEW by reading the manual and looking at some example code that comes with LabVIEW. It took me a few days to become proficient, and it cost me nothing but time. When I started, I was already an experienced programmer. That helps a lot. You need to understand what a programming language is before you start attacking it. And if you attack LabVIEW here, you can expect lots of repercussions, because we all love LabVIEW. You will too, once you understand what it is and what it can do.
tbob
20 years ago
Ben, I don't see it as the bear and the moose vs the world. I just see that the moose is not understanding what a programming language is supposed to do. It's like hardware vs software. You can use digital logic in hardware to create a system. You can also use sequential logic in software to do similar things. Data flow is the same in both cases. The output is not valid until all data inputs are satisfied. I tried teaching LabVIEW to a non-programmer once, and he just could not grasp it. I had to start with simple programming concepts, like explaining what A=A+1 did. Mathematically speaking, A=A+1 is impossible, but programmatically, everyone understands that you are adding 1 to whatever A "was", and storing the new value back to A. Once the moose understands programming, he will welcome LabVIEW. As far as design methods go, top-down design is my favorite, but it does not lend itself to every single situation. Yes, I have seen some horrible approaches in my day, and it usually ends up in a mess. There are books out there that deal with this subject. Maybe someone could recommend a good book for the moose. Sorry, MooseMan, it will take time to get up to speed. There is no other way around it.
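For what it's worth, the A=A+1 idea carried across loop iterations is exactly what a LabVIEW shift register does on a while or for loop. In a text language the same pattern is just an accumulator variable; this trivial Python sketch only illustrates the concept, with comments mapping each line to the LabVIEW equivalent.

```python
# One loop, one value carried from each iteration to the next:
a = 0                    # initial value, like the one wired into the
                         # left-hand shift-register terminal
for _ in range(10):
    a = a + 1            # read the old value, add 1, write the new value
                         # back, to be read again on the next iteration

print(a)                 # 10
```

The "impossible" equation A=A+1 is just a read-modify-write on stored state, which is the same role the flip-flops discussed earlier play in hardware.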
Ben
20 years ago
I will let the Moose Man speak for himself if he chooses to re-enter.<br><br>My point is that there is a big part of programming that is not covered in any of the LV courses I am aware of.<br><br>The part that is missing is structure (maybe the wrong term).<br><br>I have already said a lot about design and will not repeat it now.<br><br>When I start a design for a project, I spend a good deal of time figuring out all of the parts and how they will work together. After I complete that phase, I can proceed to the SDE work.<br><br>Some of the things I ponder are similar to the thoughts that go into a database design that is completely normalized (again, maybe a bad term).<br><br>What data am I responsible for?<br><br>What uses what data, when, and how often?<br><br>Other thoughts include WHILE questions, like what happens while something else?<br><br>During this phase I draw up fuzzy pictures and give things names like GIZMO_Watcher, GIZMO_Logger, etc.<br><br>The initial design functions are usually done using PowerPoint for the pictures and Excel for the what-data-where stuff.<br><br>Maybe this is just my unique methodology....<br><br>But I end up with relationships between my PP/Excel work and the final VI.<br><br>Often the "tables" equate to arrays of clusters.<br><br>Widget_Watcher ends up being a VI developed via the SDE.<br><br>So....<br><br>My point is there is still a lot of "top end" room for LV to grow into.<br><br>I'd like to see the full development cycle, from initial design onward, supported by LV.<br><br>Just my dream,<br><br>Ben<br><br>BTW: Please do not take offense at my comments or pull any punches! This is a good topic and I can handle some mean jabs. Just don't insult the bear or threaten to shave him, etc.<p>Message Edited by Ben on <span class=date_text>05-23-2005</span> <span class=time_text>03:50 PM</span>
Ed Dickens
20 years ago
<blockquote><hr>I think NI should include some basic information on programming that is read before anything else. I'd bet we see a significant drop in questions posted here. On the other hand, since it seems many don't read the manual anyway, maybe it wouldn't make any kind of difference.;)<hr></blockquote><br><br>In fact, NI does include this ability, at least as it pertains to LabVIEW.<br><br>Open your LabVIEW.ini file and look for the key "IsFirstLaunch=False". Change it to "True", save and close the file, then launch LabVIEW. A different dialog opens than the one we normally see, because when we first installed, we zipped past this one because we didn't need it. But look at it. Big bold letters: "New to LabVIEW?". And a direct link to the "Getting Started" manual. This may not be the best manual in the world, but at least it's something.<br><br><img src="http://forums.ni.com/attachments/ni/170/122636/1/First%20Time%20Dialog.png"><br><br>The problem is nobody (including myself) wants to sit and read anything. We'd rather just dive right in and get to work. Only after getting frustrated because we can't do what we want do we resort to reading. By then we've turned off this dialog and there's no easy way to find it again. There's no link to it in the Help menu. You have to go to the Bookshelf to find it. And that's not very obvious to most. So you're right. It probably won't make any difference, because that's human nature now. Do first, read later.<br><br><blockquote><hr>My point is that there is a big part of programming that is not covered in any of the LV courses I am aware of.<br><br>The part that is missing is structure (maybe wrong term).<hr></blockquote><br><br>The <a href="http://sine.ni.com/nips/cds/view/p/lang/en/nid/12769" target=_blank>LabVIEW Intermediate courses</a> actually attempt to teach LabVIEW programming structure.
We go through top-down and bottom-up approaches, as well as analyzing your application requirements, choosing the correct design pattern and data structures for your application, and a host of other topics. In this course, you develop a more complex application than the one you do in Basics 1&2, in a structured manner much more like what you encounter on the job. These are new courses and still need a little work. I've taught them a couple of times, and in order for the student to get the most from it, they should have several months of on-the-job work under their belt, because it moves pretty fast. If you've just taken the Basics class and jump right into the Intermediate class, you'll probably be lost.<br><br>But that has nothing to do with the original topic of the thread. What is LabVIEW? One of Moose Man's questions was: <blockquote><hr>how *does* LabVIEW structure and organize things<hr></blockquote>. I think the answer is: it doesn't. It only does what you tell it to do. And in order for it to do what you want it to do, you have to learn how to manipulate the tools it provides. And unfortunately, there's no shortcut for that. Only time and experience (maybe a class or two if you have the time and money) will get you there.<br><br><blockquote><hr>OK it looks like it the bear and the moose vs the world!<hr></blockquote><br><br>Good thing you don't use a squirrel for your icon. I'd be having flashbacks from many years ago. ;)<br><br>Ed<p>Message Edited by Ed Dickens on <span class=date_text>05-23-2005</span> <span class=time_text>04:49 PM</span><p>Message Edited by Ed Dickens on <span class=date_text>05-23-2005</span> <span class=time_text>04:51 PM</span>


First Time Dialog.png:
http://forums.ni.com/attachments/ni/170/122636/1/First Time Dialog.png
tst
20 years ago
<blockquote><hr>That wouldn't be so bad if there were a really good tutorial out there somewhere, that could in an hour or two, get you off and running on real world projects. <hr></blockquote><br>I really *really* doubt that. Is there a C++ tutorial that can get you up and running in a couple of hours?<br>If there is, please show me.<br>Seriously though, LV wasn't designed as a general-purpose programming language. It did sort of "grow into it" over time, and still does so now, and there are many areas where it's lacking.<br>There's no getting around learning curves. You have to start small and you have to make the stupid mistakes, because that's the way you learn. Now, if you said that you wanted to connect to an oscilloscope and display what it's showing on your screen, then we could say "yes, you can do that fast", but something more complicated will take time.<br>It's quite possible that you're right and that there is no really good tutorial for starting out. Personally, I started by learning from someone and by working myself and having my mistakes explained to me. The rest of my LV knowledge comes from web sources like this site and from experience. I often send beginners <a href="http://cnx.rice.edu/content/col10241/latest/" target=_blank>here</a> and <a href="http://zone.ni.com/devzone/learningcenter.nsf/03f7c60f17aad210862567a90054a26c/55974411828f779086256ce9007504bd" target=_blank>here</a> as two examples of tutorials, and I tell them to read <a href="http://zone.ni.com/devzone/conceptd.nsf/webmain/CB5E46406090C61C86256A7000559B66" target=_blank>the LabVIEW style guide</a>. I haven't gone through these tutorials, and it's likely they won't help you, because they show how to start with LV and not how LV works, but you can try. The style guide is definitely worth reading.<br>I can't explain what you're asking about the LV programming model, because the description of "Dataflow" sounds perfectly natural to me.
There are a few links you may wish to look at <a href="http://www.ni.com/labview/power.htm" target=_blank>here</a>.
Ben
20 years ago
Hi Moose Man,<br><br>I may be making a mistake by stepping into this thread now, but....<br><br>1) I am on your side.<br><br>2) Once you get over the learning curve you are going to be in good shape.<br><br>I am still somewhat surprised to see just how few LV types understand top-down design. They seem to bottom-up into a mess pretty often.<br><br>This link talks about using the LV State Diagram Editor.<br><br>http://forums.ni.com/ni/board/message?board.id=170&message.id=112440#M112440<br><br>Feel free to post follow-up Q's or comments if you like.<br><br>If this does not get you any closer to where you want to go, please ask!<br><br>Ex-hardware Engineer (that got better),<br><br>Ben<br><br>PS. I believe the new SDE tutorial uses a vending machine as one of the examples.<br><br>Post conflict detected! I did not read Dennis and tst before making my post. I still think the answer is to start with the SDE, because the Moose Man is comfortable there.<br><br><p>Message Edited by Ben on <span class=date_text>05-23-2005</span> <span class=time_text>02:50 PM</span><p>Message Edited by Ben on <span class=date_text>05-23-2005</span> <span class=time_text>02:53 PM</span>