Types are central to EGL technology and are meaningful at development time, transformation time, and run time.
In general usage, a type such as integer or string identifies a set of values and a set of operations that can be applied to those values. For example, integers are whole numbers that can be added, subtracted, and so forth; and strings are character sequences that can be shortened, lengthened, and so forth. The number 5 is a value of the integer type and is also said to be an instance of that type. Similarly, the phrase “yes!” is a value and an instance of the string type. In the most general sense, the terms “value” and “instance” are interchangeable.
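The operations just described can be sketched in EGL itself. The variable names here are illustrative, not part of any standard:

   myInteger INT = 5;
   myInteger = myInteger + 2;      // integers support arithmetic
   myPhrase STRING = "yes";
   myPhrase = myPhrase + "!";      // strings can be lengthened by concatenation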
In EGL, a field declaration is a statement that identifies a memory area available to your code. The area either contains or references an instance of a given type. For details on the distinction between containing and referencing, see “Value and reference types.”
   // variable declarations
   myString STRING;                  // a string
   myInteger INT;                    // an INT, or integer
   myNumber DECIMAL(5,2) = 123.45;   // a number with a decimal point

   // constant declarations
   const MYSTRING01 STRING = "Value";
   const MYINTEGER01 INT = 5;
   const MYNUMBER01 DECIMAL(5,2) = 123.45;
Each of the first set of statements declares a variable, which identifies a memory area that you can change in subsequent statements. Each of the second set of statements declares a constant, which identifies a memory area that you cannot change in subsequent statements.
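The distinction can be sketched as follows; the names are illustrative, and the commented-out line shows a statement that EGL rejects:

   myCount INT = 1;
   myCount = 2;             // allowed: a variable can be reassigned

   const MAXCOUNT INT = 10;
   // MAXCOUNT = 20;        // not allowed: a constant cannot be reassigned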
For ease of coding, EGL is typically not case sensitive. Exceptions to this rule involve names such as program names, which appear in the source code and are also visible in the runtime system.
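The effect of case insensitivity on ordinary identifiers can be sketched as follows; the variable name is illustrative:

   total INT = 5;
   Total = Total + 1;       // refers to the same variable as "total"
   TOTAL = TOTAL + 1;       // likewise; identifiers are not case sensitive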