Topic: declare keyword, type inference
Author: maxtal@extro.ucc.su.OZ.AU (John MAX Skaller)
Date: Thu, 1 Oct 1992 17:59:58 GMT
How do you feel about:
    int x,y;
    long z;

    declare var=z+x*y;
    declare const cvar=x/y;
that is, when 'declare' (or dcl for short like PL/1) is used,
the type is inferred from the initialisation. Am I correct in
assuming that the type is always determined?
Would this be 'a good thing' or do you hate it? Does the syntax
actually work?
One advantage is that if, say, 'z' above were made 'float', then
the type of var would change automatically. This
could be a disadvantage too.
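For concreteness, here is a sketch of the types the proposal would infer for the declarations above, written out explicitly (the initial values are made up purely for illustration; the inferred types just follow the usual arithmetic conversions):

    int x = 2, y = 3;
    long z = 5;

    /* what 'declare' would infer: */
    long var = z + x*y;        /* x*y is int; long + int -> long */
    const int cvar = x/y;      /* int / int -> int, made const   */

    /* If z were changed to float, the initialiser z + x*y would have
       type float, so 'declare var = z + x*y;' would make var a float. */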
One could also write
declare float x;
which is equivalent to the usual
float x;
--
;----------------------------------------------------------------------
JOHN (MAX) SKALLER, maxtal@extro.ucc.su.oz.au
Maxtal Pty Ltd, 6 MacKay St ASHFIELD, NSW 2131, AUSTRALIA
;--------------- SCIENTIFIC AND ENGINEERING SOFTWARE ------------------
Author: fjh@munta.cs.mu.OZ.AU (Fergus James HENDERSON)
Date: Fri, 2 Oct 1992 05:05:10 GMT
maxtal@extro.ucc.su.OZ.AU (John MAX Skaller) writes:
>How do you feel about:
>
> int x,y;
> long z;
>
> declare var=z+x*y;
> declare const cvar=x/y;
>
>that is, when 'declare' (or dcl for short like PL/1) is used,
>the type is inferred from the initialisation. Am I correct in
>assuming that the type is always determined?
>
>Would this be 'a good thing' or do you hate it? Does the syntax
>actually work?
Not the dreaded... NEW KEYWORD!!
Well I guess that it doesn't *quite* clash with the macro "declare"
in <generic.h>, since that macro takes two parameters.
What about declaring references? Would it work like this?
declare& ref = x;
Actually, there is prior art in the form of an extension with similar
functionality in gcc, the "typeof" keyword. You could then do something like
#define dcl(var,expr) typeof(expr) var = (expr)
dcl(var,z+x*y);
to get what you are looking for, albeit with a very ugly syntax.
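For anyone who wants to try it, here is a self-contained sketch using that extension (non-standard, so it is a gcc/g++-only demonstration; some g++ modes want the __typeof__ spelling instead, and the variable names are just for illustration):

    /* gcc/g++ extension only -- not portable C++ */
    #define dcl(var,expr) typeof(expr) var = (expr)

    int main()
    {
        int x = 2, y = 3;
        long z = 5;

        dcl(var, z + x*y);     /* var has type long */
        dcl(cvar, x/y);        /* cvar has type int */

        return (int)(var + cvar);
    }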
Seriously, I don't think that it is worth the effort.
Judicious use of typedefs achieves 90% of what you are after.
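For example (one reading of that suggestion, reusing the variables from the original post; the typedef name is made up):

    typedef long Accum;          /* pick the result type once...        */

    Accum var = z + x*y;         /* ...and change it in one place later */
    const Accum cvar = x/y;

    /* Of course the typedef does not track the initialiser's type
       automatically, which is the missing 10%. */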
You've got a snowflake's chance in hell of getting it into the standard :-)
--
Fergus Henderson fjh@munta.cs.mu.OZ.AU
This .signature virus is a self-referential statement that is true - but
you will only be able to consistently believe it if you copy it to your own
.signature file!
Author: jbn@lulea.trab.se (Johan Bengtsson)
Date: 2 Oct 92 14:53:56 GMT
fjh@munta.cs.mu.OZ.AU (Fergus James HENDERSON) writes:
: maxtal@extro.ucc.su.OZ.AU (John MAX Skaller) writes:
:
: >How do you feel about:
: >
: > int x,y;
: > long z;
: >
: > declare var=z+x*y;
: > declare const cvar=x/y;
: >
: >that is, when 'declare' (or dcl for short like PL/1) is used,
: >the type is inferred from the initialisation. Am I correct in
: >assuming that the type is always determined?
: >
: >Would this be 'a good thing' or do you hate it? Does the syntax
: >actually work?
:
: Not the dreaded... NEW KEYWORD!!
If this capability were desired (and I disagree), then the sensible thing
would be to make the "inferred type" the default type, instead of the
silly "default int" rule we are living with today.
var=z+x*y; // typeof(var) == typeof(z+x*y) (long)
const cvar=x/y; // typeof(cvar) == const typeof(x/y) (const int)
Anyway, what I _really_ wanted to say was this:
Let's get rid of the "default int" rule first. IMHO that is
far more important, since that rule seriously cripples a
compiler's ability to do sensible error reporting/recovery.
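To make the complaint concrete, here is a sketch of the sort of declarations the "default int" rule (inherited from C) lets through; each one silently gets type int, which is exactly why a compiler cannot treat a missing type specifier as the error it usually is (the names are made up, and compilers that have dropped the rule reject all of these):

    static counter;          /* really 'static int counter'    */
    const limit = 10;        /* really 'const int limit = 10'  */
    extern get_value();      /* a function returning int       */
    main() { return 0; }     /* implicit 'int' return type     */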
--
--------------------------------------------------------------------------
| Johan Bengtsson, Telia Research AB, Aurorum 6, S-951 75 Lulea, Sweden |
| Johan.Bengtsson@lulea.trab.se; Voice:(+46)92075471; Fax:(+46)92075490 |
--------------------------------------------------------------------------
Author: tmb@arolla.idiap.ch (Thomas M. Breuel)
Date: 4 Oct 92 01:36:54 GMT
In article <1992Oct1.175958.725@ucc.su.OZ.AU> maxtal@extro.ucc.su.OZ.AU (John MAX Skaller) writes:
   How do you feel about:

       int x,y;
       long z;

       declare var=z+x*y;
       declare const cvar=x/y;

   that is, when 'declare' (or dcl for short like PL/1) is used,
   the type is inferred from the initialisation. Am I correct in
   assuming that the type is always determined?
C++ made a decision to go with overloading. Overloading doesn't
coexist well with type inference.
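A small sketch of the friction (the function names below are made up for illustration): once a name is overloaded, an initialiser need not determine a unique type by itself, so there is nothing definite to infer:

    int    f(int);
    double f(double);

    // declare g = f;       // which f?  the initialiser alone is ambiguous
    int (*g)(int) = f;      // fine today: the declared type picks the overload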
   Would this be 'a good thing' or do you hate it? Does the syntax
   actually work?
Type inference is important when you use very complicated data types
(often involving higher-order functions) and (Milner-style)
polymorphism, since without it, you'd spend most of your programming
effort on trying to come up with the right type for every variable.
But C++ is not very well-suited to expressing those kinds of problems
in the first place because of other limitations (e.g., no closures).
For many of the domains to which C++ is well suited (e.g., numerical
code involving linear algebra, complex numbers, geometrical objects),
I find that overloading is more useful than type inference.
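As a small illustration of that last point, here is a made-up minimal complex class (not a library one) where operator overloading keeps the usual mathematical notation across operand types:

    struct Complex {
        double re, im;
        Complex(double r, double i) : re(r), im(i) {}
    };

    Complex operator+(Complex a, Complex b)
        { return Complex(a.re + b.re, a.im + b.im); }
    Complex operator*(double k, Complex a)
        { return Complex(k * a.re, k * a.im); }

    /* the same '+' and '*' notation as for built-in doubles: */
    Complex axpy(double a, Complex x, Complex y) { return a*x + y; }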
Thomas.