# Programming language origins?



## MrKowz (Mar 9, 2011)

The weekly ponderable!

I've often thought, how are programming languages made?  Why does VBA know that For i = 1 to 10 means it has to use the variable i = 1, then i = 2, then i = 3, etc...  How does it even know *what* a variable is!?  How does it know what each character is meant to do?

We type all of these words into a programming editor, but have you ever wondered how really simple it is all made, even when it seems incredibly complex?  Would it be just a bajillion 1's and 0's?


----------



## Aladin Akyurek (Mar 9, 2011)

MrKowz said:


> The weekly ponderable!
> 
> I've often thought, how are programming languages made? Why does VBA know that For i = 1 to 10 means it has to use the variable i = 1, then i = 2, then i = 3, etc... How does it even know *what* a variable is!? How does it know what each character is meant to do?
> 
> We type all of these words into a programming editor, but have you ever wondered how really simple it is all made, even when it seems incredibly complex? Would it be just a bajillion 1's and 0's?


 
A good read would be:

Brookshear, J. G. (1985). Computer science: An overview. Benjamin/Cummings.


----------



## diddi (Mar 9, 2011)

the languages we use are just an interface to the real language that the cpu uses.  you are correct in saying that it's just a bunch of 1s and 0s.  way back in the olden days, i used to program at assembler level, which is one step above binary. cpus only understand binary, and it is the luxury of speed and memory that allows us to use text based editors, and then visual environments.

the vbe is firstly an 'interpreter', which interprets the English code and parses it for correctness (syntax), and when we choose Run it 'compiles' the code so that the cpu can understand what is required.
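For what it's worth, the "parses it for correctness" step can be sketched in a few lines. This is a toy sketch (Python, purely illustrative; the name `run_for` is made up, and real VBA parsing is far more involved) showing that `For i = 1 To 10` is just text until a program matches it against an expected pattern and supplies the looping behaviour itself:

```python
import re

# Toy sketch of what an interpreter does with "For i = 1 To 10":
# the keywords mean nothing by themselves; a parser matches the
# expected pattern, and the host program supplies the behaviour.
FOR_STMT = re.compile(r"For\s+(\w+)\s*=\s*(\d+)\s+To\s+(\d+)", re.IGNORECASE)

def run_for(line):
    """Parse a For statement, then act out its meaning."""
    m = FOR_STMT.match(line)
    if not m:
        raise SyntaxError("expected: For <var> = <start> To <end>")
    var, start, end = m.group(1), int(m.group(2)), int(m.group(3))
    values = []
    for value in range(start, end + 1):   # the "meaning" of For..To
        values.append((var, value))
    return values

print(run_for("For i = 1 To 10")[:3])   # [('i', 1), ('i', 2), ('i', 3)]
```

The point is that the keyword `For` has no inherent meaning: the interpreter's author decided that text matching this pattern triggers a counting loop inside the host program.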

the cpu can only do a small number of operations such as compare, add, AND, OR etc., and the clever people at Intel hard code an instruction set onto the cpu that teaches it how to perform more complex tasks.  it takes literally hundreds of cpu instructions to perform even a simple task like loading an empty window.


----------



## chuckchuckit (Mar 10, 2011)

Some years ago I read a story about how computers originally were very hard to use. And a woman in the US Navy came up with the first programming language. Just recently I tried to find info on her and was not able to find anything about her.

If I remember right, she took the 1's and 0's language and changed it into something usable where commands could be given for computers to understand. It was a very interesting article but don't remember where I read it.

We men tend to be very task oriented and can categorize problems and solve them one at a time. Whereas women supposedly are more spaghetti oriented, where their thoughts wander more to include the unrelated and perhaps irrelevant (according to us men anyway).

So that was why I found it interesting that it was a woman who was acknowledged as the first one to develop a computer language. I think she came up with the idea and implemented it to be an actual working reality. 1940's, 1950's, something like that. Maybe while the men were perfecting the 1's and 0's processing, she spaghettied out the process to include a more practical interface we men had not highly considered yet?

Would like to re-read things about that development if it can be found.

I often talk over my programming pursuits with my wife. And she goes off into what seems to be the irrelevant sometimes. But sometimes she brings things into the process that I have completely overlooked while I have been working away on perfecting a task, and am missing other important relevancies. It is a good balance sometimes.

Chuck


----------



## Easy-XL Support (Mar 10, 2011)

http://en.wikipedia.org/wiki/Grace_Hopper


----------



## chuckchuckit (Mar 10, 2011)

Yep - that's her. Good find.


----------



## T. Valko (Mar 10, 2011)

MrKowz said:


> The weekly ponderable!
> 
> I've often thought, how are programming languages made? Why does VBA know that For i = 1 to 10 means it has to use the variable i = 1, then i = 2, then i = 3, etc... How does it even know *what* a variable is!? How does it know what each character is meant to do?
> 
> We type all of these words into a programming editor, but have you ever wondered how really simple it is all made, even when it seems incredibly complex? Would it be just a bajillion 1's and 0's?


I've often contemplated this:

If you need to know a computer language to create a new computer language then how did they create the very first computer language?


----------



## chuckchuckit (Mar 10, 2011)

Yes - she tended to push the envelope on things a bit. Went against the grain a bit, so to speak. Sometimes that is what it takes to discover or develop something. Not simply accept things as they are, but ask why, and pursue the what-ifs. Develop that type of pattern of looking into things. Not do things a certain way just because that is how they are always done, but pursue perhaps what is possible or what really are the facts.

I remember now where I read about her before. It was her obituary in "Investor Business Daily" in the section they had on Successful People. I don't get IBD anymore, but I used to clip out their articles on Successful People or whatever it was called. And would reread them occasionally. And give them to my nephews when they were growing up.

As I wanted to understand better how many of them thought and how they went about things. There are patterns they had in common. And I found it very interesting and then try to apply those things when certain situations call for such.

Applies in programming too when one tries to do things perhaps not of the status quo so to speak. Or wants to develop something useful or needed.


----------



## diddi (Mar 10, 2011)

T. Valko said:


> I've often contemplated this:
> 
> If you need to know a computer language to create a new computer language then how did they create the very first computer language?



the first 'language' i used was 6502, but every step was manual.  as in, i had a circuit board with a 0 - F keypad on it, a 6502 and some support chips, some 7-segment displays like an old calculator, and about 512 bytes of memory.

first, one hand-wrote the instructions in 'pseudocode'
eg  Call subroutine MySub

would have been

JFN MySub           which meant JumpFunction MySub

then one hand-converted that to hexadecimal...

20 0F 10               (20 - the hex for JFN)    (0F 10   the memory location where MySub starts)

then when the whole program was done, one keyed it in on the hex keypad.
IF it worked, then you got it right, but invariably it took several goes to make it behave.
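The hand-assembly steps above can be sketched as a tiny table-driven translator (Python, purely illustrative). One detail worth noting: on a real 6502 the subroutine-call mnemonic is JSR, its opcode is indeed $20, and the two operand bytes are the target address stored low byte first, which is why MySub at $100F comes out as 20 0F 10:

```python
# Minimal sketch of hand-assembling one line into 6502 machine bytes.
# On a real 6502 the call mnemonic is JSR (opcode $20, absolute
# addressing); the operand address is stored low byte first.
OPCODES = {"JSR": 0x20}

def assemble(line, labels):
    """Turn e.g. 'JSR MySub' into a list of machine-code bytes."""
    mnemonic, operand = line.split()
    addr = labels[operand]              # resolve the label to an address
    return [OPCODES[mnemonic],
            addr & 0xFF,                # low byte first (little-endian)
            (addr >> 8) & 0xFF]         # then the high byte

labels = {"MySub": 0x100F}              # suppose MySub lives at $100F
code = assemble("JSR MySub", labels)
print([f"{b:02X}" for b in code])       # ['20', '0F', '10']
```

An assembler is really no more than this lookup-and-encode loop repeated for every line, which is why it was practical to do by hand.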


the first languages evolved from this by putting instruction groups together and saving them in a ROM, allowing the user to use a more 'English like' environment to access the code snippets.


Ah them was the days


----------



## PaddyD (Mar 10, 2011)

Keywords instead of a long answer:

- logical design of switching systems
- turing machine programme
- compilation

...long, though not necessarily good, answer available if you want one.


----------



## tweedle (Mar 11, 2011)

Oh gravy! I remember similar in my first round of robotics/logic. pick your TI chipsets, set the hex and pray for the correct set of red lights on the other side.


----------



## diddi (Mar 11, 2011)

@Tweedle
Did you ever build your own from TTL 'LS138s and a half-dead EPROM from some other project?  And then you realise that the Z80 had more registers, so you start again... LOL
now those were the REAL days of computing...  

have a look at this for a nostalgia burst...
http://www.6502.org/homebuilt


----------



## Sandeep Warrier (Mar 12, 2011)

Hey Paddy

It'll be interesting to read the long answer!


----------



## tweedle (Mar 12, 2011)

Thanks for the link/stroll down memory lane diddi. 
I don't remember all the details, but I do remember a valuable lesson in assumptions.  Do not! assume your o-scope is calibrated correctly...heh heh heh.


----------



## ZVI (Mar 13, 2011)

MrKowz said:


> The weekly ponderable!
> 
> I've often thought, how are programming languages made?  Why does VBA know that For i = 1 to 10 means it has to use the variable i = 1, then i = 2, then i = 3, etc...  How does it even know *what* a variable is!?  How does it know what each character is meant to do?
> 
> We type all of these words into a programming editor, but have you ever wondered how really simple it is all made, even when it seems incredibly complex?  Would it be just a bajillion 1's and 0's?


For an example of how this works in VB, see the class module clsEquation.cls in BASLIBS.ZIP.



T. Valko said:


> I've often contemplated this:
> 
> If you need to know a computer language to create a new computer language then how did they create the very first computer language?


Sounds like the chicken and egg paradox 

A similar question - can you write programs that write programs?
The well-known statement from K&R's "The C Programming Language" is that the C compiler is written in C.
Nowadays plenty of compilers for programming languages are written directly in the target language.
This programming paradox is known as "self-hosting" or "bootstrapping" 

One of the possible ways, in which a human is part of the compilation process, includes:
1. Writing the compiler's source code in the target language
2. Manually translating it to another language
3. Compiling the translated source code. This magic can even be done by hand!

Once the working compiler is made, it can be used to compile the original sources directly, and manual translation is no longer necessary.
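The "programs that write programs" idea is easy to demonstrate in miniature. In this toy sketch (Python; `generate_adder` is a made-up name), the outer program emits source code as plain text and only then turns it into something executable, which is the same relationship a compiler has to the code it produces:

```python
# Toy sketch of "a program that writes a program": the outer script
# generates source code as text, then loads it, much as a compiler
# emits code it never executes itself.
def generate_adder(n):
    """Emit the source of a function that adds n to its argument."""
    return f"def add_{n}(x):\n    return x + {n}\n"

source = generate_adder(5)
namespace = {}
exec(source, namespace)          # "compile and load" the generated code
print(namespace["add_5"](10))    # 15
```

A real bootstrap just scales this up: the generated text is the compiler itself, written in the very language it compiles.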

Compilers tend to require a wide range of algorithmic techniques and a variety of data structures.
Therefore, if a language can implement its own compiler, it's likely to be useful for most other tasks as well.
A language which is not powerful enough to write its own compiler is in some jargon called a "My Favourite Toy Language" (MFTL),
a term that refers to anything (simple) that you can't write a compiler in.


----------

