Topic: Is C++ another dinosaur?
Author: harrison@sp10.csrd.uiuc.edu (Luddy Harrison)
Date: 20 May 1994 13:02:58 GMT
John Max Skaller writes:
>>
>> Linkers ...
>> ...
>> Actually, I don't see this but the exact opposite.
>> Far from the language design constraining implementations,
>> the failure to see that the archaic C linkers used by Unix are
>> already unsuitable for C++ without extreme hackery is impeding
>> upgrades to the language which more modern linkage technology
>> can already handle.
>> ...
>> I have no doubt at all that Unix will respond,
>> and that more powerful linkers will become commonplace.
and in another note, concerning inline functions with external linkage:
>> If they must, then the compiler has to decide WHERE
>> to instantiate the function -- and it can't do that. Only
>> the linker can do it. And Unix linkers can't do it, they're
>> not smart enough.
It seems to me that the definition of "linker" is being stretched beyond
reason in this thread. We load the responsibility for instantiating
templates, inline functions, and so on, on the poor linker, and then
we complain that it is inadequate because it is not up to the task.
But if template and inline function instantiation are linking, then
lexing, parsing, code generation, and optimization are all linking,
and finally the linker contains all of the functionality of the compiler
(and, incidentally, the functionality we used to call linking, namely
the joining together of relocatable object files into an executable).
Perhaps the linker needs to have more functionality to support C++.
For example, the question was raised whether linkers should evaluate
expressions that involve symbolic constants from more than one object
file. I have often wished that ld could pull individual symbols from
a .o file rather than including either the entire object file or
nothing (some DOS linkers can do that, I'm told). This kind of
functionality belongs to the linker. But template and inline function
instantiation is plainly a language implementation problem and not
primarily a problem of linking. There is a solution to the problem
that uses a linker to discover symbols that are undefined; but a
solution can be built that does not use the linker at all. For
example, the compiler could emit a table of types and functions used
in each compilation unit, and another program could examine these
files and instantiate missing symbols until convergence.
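A minimal sketch of such a driver, where the three helpers are
hypothetical hooks that read the emitted tables, scan the object
files, and re-invoke the compiler:

#include <set>
#include <string>

std::set<std::string> symbolsNeeded();    // hypothetical: read the emitted tables
std::set<std::string> symbolsDefined();   // hypothetical: scan the .o files
void instantiate(const std::string& s);   // hypothetical: re-run the compiler

void instantiateToConvergence() {
    // Each instantiation may itself require new symbols, so repeat
    // until a full pass produces nothing new.
    bool progress = true;
    while (progress) {
        progress = false;
        std::set<std::string> need = symbolsNeeded();
        std::set<std::string> have = symbolsDefined();
        for (std::set<std::string>::iterator i = need.begin(); i != need.end(); ++i) {
            if (have.find(*i) == have.end()) {
                instantiate(*i);
                progress = true;
            }
        }
    }
}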
-Luddy Harrison
Author: jbuck@synopsys.com (Joe Buck)
Date: Tue, 17 May 1994 17:54:51 GMT
dak@kaa.informatik.rwth-aachen.de (David Kastrup) writes:
>>Now C++ requires the computer to read a whole bunch of include
>>files and reparse them every time it produces code.
rmartin@rcmcon.com (Robert Martin) writes:
>Unless you are using a compiler that uses "precompiled headers".
>Although I am not going to leap for joy at Borland's scheme for
>precompiled headers, if done right it can speed up a compile by a
>factor of 3 or more.
Unfortunately, the design of C and C++ makes it very difficult to do
precompiled headers in a general way. The problem is that #include
causes the included file to be inserted at that point in the processing of
the outer file, so that the parsing of that file depends on whatever
#define directives, typedefs, and previous declarations are in scope.
This makes it very difficult for a compiler to determine, given an
#include directive, that it can use a previously compiled form of the
included file.
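A small illustration, with a hypothetical one-line header list.h
containing "typedef ELEM element;":

// a.cpp
#define ELEM int
#include "list.h"   // here element is int
// b.cpp defines ELEM as double before the very same #include, so a
// precompiled image of list.h built for a.cpp is useless for b.cpp.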
It's too late to redesign the language, so I'm not proposing the following
seriously as a language extension, but rather as an idea to consider.
What if there were a slightly different directive, say, #import, as in
#import <iostream.h>
#import would be very similar to #include, with the following differences:
1. Any text, definitions, etc. processed before the #import line would have
no effect on the processing of anything in the included file (iostream.h,
in the example).
2. A second #import of the same file would have no effect.
Note that precompiled headers now become trivial: since #import
<iostream.h> always adds the exact same set of definitions to the symbol
table, headers can be compiled completely independently of the files
that include them. Of course, headers could themselves use #import
to include other headers.
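For instance, under the proposed directive:

#define private public   // with #include, this would rewrite the header
#import <iostream.h>     // with #import, prior macros have no effect inside
#import <iostream.h>     // and a second #import of the same file is ignored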
>>As soon as you do OOP programming in C++, the turnaround times
>>get incredibly bad.
>
>There is no doubt that compile times lengthen. Yet the tools are
>improving. The ability to precompile headers is an important boon.
>And good OOD will keep the number of #includes manageable.
Nevertheless, it is common to find, in larger projects, that you have
a number of classes that are used all over the place, that you need to
make a change in the private part of the implementation, and that
500-1000 different object files then need to be recompiled. You
can partially prevent this with something like
class Foo {
private:
    class FooImplementation& theRealThing; // an opaque class.
public:
    // public interface goes here
};
but then you pay the price of an extra level of indirection, and if you
need a new public function, you're screwed again.
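For concreteness, a sketch of the implementation side, the only file
that sees the real data. It assumes the interface above lives in Foo.h
and declares a constructor, destructor, and one forwarding member; the
names are hypothetical:

#include "Foo.h"            // the interface shown above

class FooImplementation {   // free to change without touching Foo.h
public:
    FooImplementation() : state(0) {}
    int doWork() { return ++state; }
private:
    int state;
};

Foo::Foo() : theRealThing(*new FooImplementation) {}
Foo::~Foo() { delete &theRealThing; }
int Foo::work() { return theRealThing.doWork(); }   // the indirection cost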
I'm talking about systems with thousands of classes and hundreds of
thousands to millions of lines of code here. I'm not asking the committee
to solve these large-system problems, I'm only requesting more sensitivity
to them than I sometimes see expressed here. Implementers can handle
problems like coming up with databases and other tricks to reduce the
amount of recompilation, e.g. in the above, a new nonvirtual public
member doesn't really require recompilation of files that don't use the
new function. The committee should only worry if they've specified some
language feature that would *prevent* implementers from solving these
problems.
--
-- Joe Buck <jbuck@synopsys.com>
Posting from but not speaking for Synopsys, Inc.
***** Stamp out junk e-mail spamming! If someone sends you a junk e-mail
***** ad just because you posted in comp.foo, boycott their company.
Author: cbarber@bbn.com (Christopher Barber)
Date: 17 May 1994 23:18:46 GMT
>>>>> "DK" == David Kastrup <dak@rama.informatik.rwth-aachen.de> writes:
>>> John Farley (jfarley@cen.com) wrote:
>>> Anyway, the machines get faster every day. I don't think it
>>> matters what platform you are using today, in a couple of years
>>> the compile time will be somewhat irrelevant to you.
DK> No. Running time of the produced program will *always* be an
DK> important measure, not so much for the time in which today's
DK> problems will be run, but for the sort of problems which will
DK> become doable at all.
Either you quoted the wrong part of the article or you just did not read
what was written. He clearly is talking about it not mattering that much
how long it takes to *compile* something, not about the performance of the
resultant executable.
- Chris
--
Christopher Barber
(cbarber@bbn.com)
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Wed, 18 May 1994 13:44:15 GMT
In article <nagleCpvMqt.9y7@netcom.com> nagle@netcom.com (John Nagle) writes:
>bs@alice.att.com (Bjarne Stroustrup) writes:
>>nagle@netcom.com (John Nagle @ NETCOM On-line Communication Services) writes
Linkers ...
>
> This can't really be put off as an "implementation issue". Most
>of the problems are implicit in the language design, and can't be fixed
>cleanly without some language changes.
Actually, I don't see this but the exact opposite.
Far from the language design constraining implementations,
the failure to see that the archaic C linkers used by Unix are
already unsuitable for C++ without extreme hackery is impeding
upgrades to the language which more modern linkage technology
can already handle.
Without wishing to overstate the issue <grin>,
Unix is so far behind PC technology here it is threatening
to strangle progress. Since Unix is so "powerful" an operating system,
it should do better, not worse, than PC's, but problems of market
size and portability mean that only the creation of
a Standard --WITH requirements for smart linkage -- will
force Unix to catch up or be left out in the cold.
I have no doubt at all that Unix will respond,
and that more powerful linkers will become commonplace.
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: mbk@inls1.ucsd.edu (Matt Kennel)
Date: 20 May 1994 21:01:46 GMT
John Max Skaller (maxtal@physics.su.OZ.AU) wrote:
: Without wishing to overstate the issue <grin>,
: Unix is so far behind PC technology here it is threatening
: to strangle progress. Since Unix is so "powerful" an operating system,
: it should do better, not worse, than PC's, but problems of market
: size and portability mean that only the creation of
: a Standard --WITH requirements for smart linkage -- will
: force Unix to catch up or be left out in the cold.
: I have no doubt at all that Unix will respond,
: and that more powerful linkers will become commonplace.
The dumb linkers are just fine---if you only use them as the last part
of an automated process that does more.
For example, Eiffel and Sather compilers (and Modula-3? I don't know) do
"smart" linking, with global consistency checks, no-hassle parameterized
types and all that, and produce a final executable with the standard unix
linker.
What you have to give up is the notion that you can compile a single
source file into a single .o without knowing what the other source files
in the program are.
You tell the compiler "here are my source files---figure it out
for me." The compiler, and not make, decides what needs to be recompiled
and what doesn't.
I don't see any reason C++ couldn't work that way either, even using
a dumb linker at the end.
: JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
--
-Matt Kennel mbk@inls1.ucsd.edu
-Institute for Nonlinear Science, University of California, San Diego
-*** AD: Archive for nonlinear dynamics papers & programs: FTP to
-*** lyapunov.ucsd.edu, username "anonymous".
Author: pjcreath@flagstaff.Princeton.EDU (Peter Janssen Creath)
Date: Fri, 13 May 1994 19:03:53 GMT
In article <CpFv2A.4Hv@ucc.su.oz.au>,
John Max Skaller <maxtal@physics.su.OZ.AU> wrote:
>In article <2qdljrINNgj1@early-bird.think.com> chase@Think.COM (David Chase) writes:
>>
>>So, unfortunately, I think we're stuck with the complete
>>declaration of the class.
>[deletion]
> It's NOT necessary that the private members be visible
>with this mechanism; the size can be recorded and visible without
>knowing the contents. It's just not according to the existing
>rules.
And then you could have [*GASP!*] run-time objects! (Oh, sorry, that's
Objective C...grin) Assuming your linker could handle the unresolved
methods, any object conforming to a basic standard interface could be
added rather conveniently...
But this is probably opening another 5-year can of worms...
--
------------------------------------------------------------------------
Peter J. Creath
peterc@gnu.ai.mit.edu (year-round)
pjcreath@phoenix.princeton.edu (academic year)
Author: nagle@netcom.com (John Nagle)
Date: Sun, 15 May 1994 01:24:32 GMT
bs@alice.att.com (Bjarne Stroustrup) writes:
>If I remember rightly, dinosaurs were the dominant class of animals
>for something like 100 million years. Even C++'s best friends wouldn't
>want C++ to last that long.
>However, C++ isn't going to disappear tomorrow either.
>Reading this and other threads gives me the impression that some of the
>posters could use some basic information about C++ and the aims of C++.
>I recommend:
> Bjarne Stroustrup:
> The Design and Evolution of C++
> Addison Wesley, ISBN 0-201-54330-3.
> March 1994.
I've read the book. The argument for compatibility with dumb linkers
remains weak. The scheme for automatic template instantiation described
therein, a sort of "smart linker" apparently implemented by calling
Cfront and the C compiler from a front-end program bolted onto
the linker, is an interesting experimental approach, but not what you want
in a production system.
Let's face it. You need a specialized C++ linker just to do a decent
job of linking C++ as currently defined in the ARM. That wasn't true of
early versions of C++, but it's true now.
John Nagle
Author: bs@alice.att.com (Bjarne Stroustrup)
Date: 15 May 94 17:34:32 GMT
nagle@netcom.com (John Nagle @ NETCOM On-line Communication Services) writes
> bs@alice.att.com (Bjarne Stroustrup) writes:
> >If I remember rightly, dinosaurs were the dominant class of animals
> >for something like 100 million years. Even C++'s best friends wouldn't
> >want C++ to last that long.
>
> >However, C++ isn't going to disappear tomorrow either.
>
> >Reading this and other threads gives me the impression that some of the
> >posters could use some basic information about C++ and the aims of C++.
> >I recommend:
> > Bjarne Stroustrup:
> > The Design and Evolution of C++
> > Addison Wesley, ISBN 0-201-54330-3.
> > March 1994.
>
> I've read the book. The argument for compatibility with dumb linkers
> remains weak.
I have noted that linkers seem to be a primary interest of Mr. Nagle's,
but my comment about D&E and about C++ in general wasn't restricted
to linkers or their impact on C++. Linking issues were one influence on
the design of C++, but only one of many.
C++ lends itself to a variety of implementation techniques and several
have been used. The fact that it can be implemented using traditional
dumb linkers was extremely important during the first decade or so and
remains important to many.
In some ways, you can do better with C++ where you are willing to part
with the traditional linkers, but that doesn't imply that's the right
choice for everyone. Language design and implementation involves many
delicate tradeoffs and there is no one way to please everyone.
> The scheme for automatic template instantiation described
> > therein, a sort of "smart linker" apparently implemented by calling
> > Cfront and the C compiler from a front-end program bolted onto
> the linker, is an interesting experimental approach, but not what you want
> in a production system.
I describe and criticize a couple of approaches. None, in my opinion, deserves
the description above.
> Let's face it. You need a specialized C++ linker just to do a decent
> job of linking C++ as currently defined in the ARM. That wasn't true of
> early versions of C++, but it's true now.
Since Cfront 1.2, C++ has needed some support beyond what most 1980-vintage
linkers supported. What constitutes a ``decent job of linking'' and ``what
you want in a production system'' is a matter for debate and certainly depends
on your expectations, what kind of work you do, and on what platforms.
- Bjarne
Author: rmartin@rcmcon.com (Robert Martin)
Date: Mon, 16 May 1994 20:10:22 GMT
dak@kaa.informatik.rwth-aachen.de (David Kastrup) writes:
>Now C++ requires the computer to read a whole bunch of include
>files and reparse them every time it produces code.
Unless you are using a compiler that uses "precompiled headers".
Although I am not going to leap for joy at Borland's scheme for
precompiled headers, if done right it can speed up a compile by a
factor of 3 or more.
>Let's face it: C (and C++) are nice for optimizing things yourself.
>They are a walking horror on legs for machine optimizations.
I prefer driving a clutch. ;-)
>Now, if you do real OOP programming in C++, and encapsulate well,
>every include file (containing private class details, so you need
>to recompile everything after changes of *private* things) will
>have to be reparsed for every object file. *And* we will not get
>more than pretty local optimizations reliably after all.
Well, part of good OOD is the management of source code dependencies.
If done well, this limits the number of #includes in a header file.
So, you should not be including "every" header file.
Also, as far as privacy is concerned, the use of abstract base classes
is preferred so that changes to the private area do not propagate.
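A minimal sketch of that approach:

// shape.h -- clients see only the abstract interface; the data members
// live in a derived class in a single .cpp file, so changing them never
// forces clients to recompile.
class Shape {
public:
    virtual ~Shape() {}
    virtual double area() const = 0;
};
Shape* makeCircle(double radius);   // factory, defined beside the derived class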
>As soon as you do OOP programming in C++, the turnaround times
>get incredibly bad.
There is no doubt that compile times lengthen. Yet the tools are
improving. The ability to precompile headers is an important boon.
And good OOD will keep the number of #includes manageable.
--
Robert Martin | Design Consulting | Training courses offered:
Object Mentor Assoc.| rmartin@rcmcon.com | Object Oriented Analysis
2080 Cranbrook Rd. | Tel: (708) 918-1004 | Object Oriented Design
Green Oaks IL 60048 | Fax: (708) 918-1023 | C++
Author: rmartin@rcmcon.com (Robert Martin)
Date: Mon, 16 May 1994 20:18:19 GMT
guerin@IRO.UMontreal.CA (Frederic Guerin) writes:
>John Farley (jfarley@cen.com) wrote:
>: Anyway, the machines get faster every day. I don't think it matters what
>: platform you are using today, in a couple of years the compile time will
>: be somewhat irrelevant to you.
>It depends on the relative growth of program size vs CPU power.
>If I were you, I would not bet that CPU power will encompass program size
>so spectacularly. You know, we are all humans after all, and when we get
>more power, we always want many many ... many more things to be done.
>C'est la vie! :-)
This is a good point. But things have gotten somewhat better over the
years. I can remember getting one turn around per day in the early
seventies. We would submit our decks for compile in the morning, and
get our listings back that afternoon.
In the eighties, I can remember waiting two or three hours for a
50,000 line program to compile (it was written in assembler). The
computer had to read it in from tape....
Later in the 80's I used to have to wait 15-20 minutes for a 2,000
line C program to compile and link. However, this was because the
PDP-11 that was compiling it was overloaded with everybody else's
compiles.
Nowadays, I wait 15 to 20 minutes for tens of thousands of lines of
C++ code to compile and link. I am not happy while I am waiting.
However, things are better than they used to be.
--
Robert Martin | Design Consulting | Training courses offered:
Object Mentor Assoc.| rmartin@rcmcon.com | Object Oriented Analysis
2080 Cranbrook Rd. | Tel: (708) 918-1004 | Object Oriented Design
Green Oaks IL 60048 | Fax: (708) 918-1023 | C++
Author: davidc@bruce.cs.monash.edu.au (David Chatterton)
Date: 15 May 1994 03:55:07 GMT
John Nagle (nagle@netcom.com) wrote:
: Let's face it. You need a specialized C++ linker just to do a decent
: job of linking C++ as currently defined in the ARM. That wasn't true of
: early versions of C++, but it's true now.
Is that such a bad thing? There are many features currently in the language
(and some ideas that have been thrown around in this group and others) that
would be easier to implement if a bit more time was spent compiling with a
specialized linker. I can accept that C++ should be compatible with C, but
is that also the case with the linker? (By the way, I haven't read the book;
it's not out here yet.)
David
David Chatterton | "A new character has come on the scene (I am sure I did
Comp Sci Department, | not invent him, I did not even want him, though I like
Monash Uni, Clayton, | him, but there he came, walking out of the woods of
Australia, 3168. | Ithilien): Faramir, the brother of Boromir."
Phone: 03 905 5375 | - in a letter from JRR Tolkien to his son, 4 May 1944.
email: davidc@bruce.cs.monash.edu.au
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Fri, 13 May 1994 20:16:21 GMT
In article <CHASB.94May12164028@dme3.osf.org> chasb@osf.org (Chas. Bennett) writes:
>
>> fjh@munta.cs.mu.OZ.AU (Fergus Henderson) writes:
>> |> Backward compatibility.
>
>> Why? We never had it before.
>
>If backwards compatibility with C++ were the only problem, most
>of these issues wouldn't be issues.
>
>Backwards compatibility with C is the issue. While this is one
>of the major besetting evils of C++, it's also the major reason
>for its success.
I wish I believed that. Retaining compatibility with C
causes SOME problems.
Retaining compatibility with existing C++ is the biggest
obstacle in the way of improving C++.
The reason is that C is a simple language, and not
much can go wrong with it -- and it has been
Standardised and in use for a while, so it's stable,
well specified, and what faults it and its Standard have are reasonably
well known and understood. However, faults in C++, or divergent interpretations
of it, are VERY hard to correct. C++ could easily be
simplified, but some code relying on complications and tricks
would break.
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: nagle@netcom.com (John Nagle)
Date: Mon, 16 May 1994 03:52:04 GMT
bs@alice.att.com (Bjarne Stroustrup) writes:
>nagle@netcom.com (John Nagle @ NETCOM On-line Communication Services) writes
> > I've read the book. The argument for compatibility with dumb linkers
> > remains weak.
>I have noted that linkers seem to be a primary interest of Mr. Nagle's,
>but my comment about D&E and about C++ in general wasn't restricted
>to linkers or their impact on C++. Linking issues were one influence on
>the design of C++, but only one of many.
>choice for everyone. Language design and implementation involves many
>delicate tradeoffs and there is no one way to please everyone.
> > The scheme for automatic template instantiation described
> > therein, a sort of "smart linker" apparently implemented by calling
> > Cfront and the C compiler from a front-end program bolted onto
> > the linker, is an interesting experimental approach, but not what you want
> > in a production system.
>I describe and criticize a couple of approaches. None, in my opinion, deserves
>the description above.
Author: dak@rama.informatik.rwth-aachen.de (David Kastrup)
Date: 17 May 1994 14:49:56 GMT
rmartin@rcmcon.com (Robert Martin) writes:
>guerin@IRO.UMontreal.CA (Frederic Guerin) writes:
>>John Farley (jfarley@cen.com) wrote:
>>: Anyway, the machines get faster every day. I don't think it matters what
>>: platform you are using today, in a couple of years the compile time will
>>: be somewhat irrelevant to you.
No. Running time of the produced program will *always* be an important
measure, not so much for the time in which today's problems will be run,
but for the sort of problems which will become doable at all.
And when your compiler, having lots of time to consider, is unable
to do good optimizations, what good is a good computer to you?
By the way, I'd like an OS which knows how to keep itself busy, for example
by examining the more frequently used pieces of code (especially in libraries)
and trying to optimize them.
I really think compilation/optimization should not be as static as it is
now: the computer should optimize more frequently used code passages
a lot more thoroughly than, say, exception code. How to find out,
except by running things?
Apart from future considerations:
If on the same platform C++ is much more unwieldy than another language,
and runs more slowly, you cannot say "Ok, I'll get a faster platform".
On the faster platform, C++ will still be slower than its competitor.
It is like saying, "Slow thinkers are as good at chess as fast ones.
They can just arrange longer times." Except that in those longer times
faster thinkers will get even more done.
Should we introduce a golf-like handicap for C++ evaluation?
--
David Kastrup dak@pool.informatik.rwth-aachen.de
Tel: +49-241-72419 Fax: +49-241-79502
Goethestr. 20, D-52064 Aachen
Author: kanze@us-es.sel.de (James Kanze)
Date: 17 May 1994 16:21:17 GMT
In article <CpHuD3.4Gs@ucc.su.OZ.AU> maxtal@physics.su.OZ.AU (John Max
Skaller) writes:
|> In article <JASON.94May7032003@deneb.cygnus.com> jason@cygnus.com (Jason Merrill) writes:
|> >>>>>> John Nagle <nagle@netcom.com> writes:
|> >
|> >> If you're willing to front-end a C compiler to get a C++ compiler,
|> >> what's so bad about front-ending the linker to get a C++ linker?
|> >
|> >Most C++ compilers *do* have special linker support; g++ has collect2 or
|> >GNU ld, Cfront has its collect2-like munger, xlC and Borland C++ have
|> >linkers that merge common pieces of the text segment. However, it seems to
|> >me that late binding of object sizes is outside of the realm of linker
|> >support,
|> Why? It's not that hard to do. Why shouldn't the size simply
|> be an external reference? Some linkers can add multiple externals
|> together, others can't. It is possible.
While adding (or in this case, maybe multiplying) externals together
to get a link-time constant is not particularly difficult, using such
a constant to dimension a static variable (array) is a bit more
complicated.
On the other hand, I cannot see why this should pose a problem for
local (on stack) arrays.
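Roughly, the idea, with hypothetical symbol names a compiler might emit:

extern const unsigned long __size_of_X;   // supplied by X's own object file,
                                          // resolved at link time
void* makeX() {
    return operator new(__size_of_X);     // the easy case: size read at run time
}
// static char buffer[__size_of_X];       // the hard case: here the linker
//                                        // itself would have to dimension
//                                        // the array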
--
James Kanze email: kanze@lts.sel.alcatel.de
GABI Software, Sarl., 8 rue du Faisan, F-67000 Strasbourg, France
Conseils en informatique industrielle --
-- Beratung in industrieller Datenverarbeitung
Author: dak@kaa.informatik.rwth-aachen.de (David Kastrup)
Date: 12 May 1994 12:36:57 GMT
jfarley@cen.com (John Farley) writes:
>> Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
>> : The real problem (IMHO) is requiring C++ programs which use
>> : a particular class to #include the complete (including privates)
>> : declaration of the class.
>IMHO, the REAL problem is that C++ compiles are slow. A couple of years
>ago I worried about this a *LOT*. I used all of the tricks like pointers
>to a private (hidden) class that contained the implementation data. I
>was extremely reluctant to make a change to a class interface, whether
>public or private. I did whatever it took to avoid the massive recompile
>problem. One day as I was complaining bitterly about the LANGUAGE, a
>friend pointed out that "There's nothing wrong with the language, if your
>compiles were 10 times faster you wouldn't worry about this at all!".
I once tried cutting down a tree with a knife.
IMHO, the REAL problem is that clocks run too fast. A couple of years
ago I worried about this a *LOT*. I used all the tricks like jiggling
the knife as fast as possible, and frequently sharpening it. I
was extremely reluctant to venture back into the real world, for fear
the cuts would grow shut again. I did whatever it took to avoid the
massive regrowth problem. One day as I was complaining bitterly about
the TOOL, a friend pointed out that "There's nothing wrong with the tool,
if your clock was running 10 times slower, or the tree was 10 times
smaller".
Ok, back from the fun corner. When a language requires high horsepower
to produce something about as efficient and fast as what a Basic
compiler on a C64 produces, something's wrong. When I have a workhorse
of a computer spend a whole lot of time on a compilation, I want
it to spend the time on making good, efficient code, using that
additional time for optimizations.
Now C++ requires the computer to read a whole bunch of include
files and reparse them every time it produces code.
The parser needs to be pretty complicated; a "simple" LR or LL parser
will not do (see the typedef problem of types declared after use
in class interfaces). The parser will swallow loads more time
than other parsers.
And if we get to the point where we still have processing time to
spare, then we cannot do many good optimizations, because the crummy
pointer concept from C will never guarantee us that a variable does not
change suddenly over a function call where we did not expect it.
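The classic illustration:

int f(int* a, int* b) {
    *a = 1;
    *b = 2;
    return *a;   // cannot be optimized to "return 1": a and b may
                 // point at the same int
}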
Let's face it: C (and C++) are nice for optimizing things yourself.
They are a walking horror on legs for machine optimizations.
Now, if you do real OOP programming in C++, and encapsulate well,
every include file (containing private class details, so you need
to recompile everything after changes of *private* things) will
have to be reparsed for every object file. *And* we will not get
more than pretty local optimizations reliably after all.
As soon as you do OOP programming in C++, the turnaround times
get incredibly bad.
And look what the conversion rules have got us:
We have a lot of rules trying to make arguments match calls, so that
we can write something like
complex z= 3.0 + complex(0,2);
without having operator+(double, complex).
Unfortunately, this means that if (for efficiency's sake) we do *add*
such an operator, we get dozens of expressions which will now match
this operator as well, because of the conversions.
And this means that you really cannot know which expressions might
compile at all when using complex (depends on what was provided),
and, even worse, in what way, as the compiler cannot really know
which conversions are value-preserving, which are not, which should
be preferred, and so on.
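A sketch of the effect, using a minimal complex class (declarations only):

class complex {
public:
    complex(double re = 0, double im = 0);
    friend complex operator+(const complex&, const complex&);
};

complex z1 = 3.0 + complex(0, 2);    // fine: 3.0 converts to complex(3.0)

complex operator+(double, const complex&);   // added "for efficiency"

complex z2 = 3 + complex(0, 2);      // now resolves to the new operator
complex z3 = 'a' + complex(0, 2);    // so does any left operand that reaches
                                     // double by a standard conversion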
C++ aims at the OOP market, and it starts getting some things right.
That no other language yet has the same popularity, or the renown
to be efficient while "OOP", does not change the fact that C++
achieves far too little with far too much work.
I myself prefer programming in C++ over C (by quite a bit), but it does
not change the fact that C++ is inefficient as hell in its language design:
far too complicated and obfuscated for what it achieves.
And the solution to that, I think, is not to acquire faster computers.
--
David Kastrup dak@pool.informatik.rwth-aachen.de
Tel: +49-241-72419 Fax: +49-241-79502
Goethestr. 20, D-52064 Aachen
Author: sjc@netcom.com (Steven Correll)
Date: Thu, 12 May 1994 19:43:51 GMT
>>> Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
>>> : The real problem (IMHO) is requiring C++ programs which use
>>> : a particular class to #include the complete (including privates)
>>> : declaration of the class.
This is expensive not only during compilation, but also during
linking and debugging. In the worst case, almost every compilation winds
up including declarations for the entire inheritance hierarchy and generating
debugging symbols for them. The linker (or a tool like Microsoft's CVPACK)
must spend I/O time to read the same symbols over and over, and spend
execution time hashing them to eliminate duplications. In a language which
separates interface from implementation and which does not rely on simple
textual inclusion via a preprocessor, the problem goes away: the compiler
generates symbols once for an exporter and need not regenerate them for
each importer.
I'm afraid that making C++ resemble Ada, Modula, or Fortran90 in this regard
would constitute a pretty radical change at this late date, however.
--
Steven Correll == PO Box 66625, Scotts Valley, CA 95067 == sjc@netcom.com
Author: tarheit@jupiter.cse.utoledo.edu (Tim Arheit AKA Fool Boy)
Date: Fri, 13 May 1994 00:13:22 GMT
Steven Correll (sjc@netcom.com) wrote:
: >>> Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
: >>> : The real problem (IMHO) is requiring C++ programs which use
: >>> : a particular class to #include the complete (including privates)
: >>> : declaration of the class.
: This is not only expensive not only during compilation, but also during
: linking and debugging. In the worst case, almost every compilation winds
: up including declarations for the entire inheritance hierarchy and generating
: debugging symbols for them. The linker (or a tool like Microsoft's CVPACK)
: must spend I/O time to read the same symbols over and over, and spend
: execution time hashing them to eliminate duplications. In a language which
: separates interface from implementation and which does not rely on simple
: textual inclusion via a preprocessor, the problem goes away: the compiler
: generates symbols once for an exporter and need not regenerate them for
: each importer.
This does not have to be the case. SAS C (for example) lets you compile
all the headers you want into a GST (global symbol table), so it eliminates
the necessity of reading all the header files and parsing them again. It
speeds up compilation many times. I have heard that djgpp (dos port of gcc)
had this ability but it was disabled because it had some bugs in it.
btw, the I/O time spent reading in includes and source really only accounts
for 5-10% of the compilation time (for moderate-sized code) in the compilers
I've seen tested (djgpp was one of them).
-tim
--
------------------------------------------------------------------------------
Tim Arheit "He prayeth best who loveth best all things
tarheit@jupiter.cse.utoledo.edu both great and small" --Coleridge
Author: chasb@osf.org (Chas. Bennett)
Date: 12 May 1994 20:40:27 GMT
> fjh@munta.cs.mu.OZ.AU (Fergus Henderson) writes:
> |> Backward compatibility.
> Why? We never had it before.
If backwards compatibility with C++ were the only problem, most
of these issues wouldn't be issues.
Backwards compatibility with C is the issue. While this is one
of the major besetting evils of C++, it's also the major reason
for its success.
chasb
--
-------------------------------------------------------------------------------
Chas. Bennett Contract Software Engineer Boston, MA
chasb@world.std.com
#include "electron.h"
#include "colors.h"
Electron sge;
sge.color( green );
sge.scream();
-------------------------------------------------------------------------------
Author: sfc@datascope.com (Steve F. Cipolli (P/M))
Date: Thu, 12 May 1994 15:55:21 GMT
John Max Skaller (maxtal@physics.su.OZ.AU) wrote:
: In article <9412814.10061@mulga.cs.mu.OZ.AU> fjh@munta.cs.mu.OZ.AU (Fergus Henderson) writes:
: >>in C++ which result from NOT having a module system. Overhauling
: >>the language definition to solve those problems is certain
: >>to delay Standardisation.
: >
: >But you will still have to solve those problems anyway, unless
: >you abandon backward compatibility.
: However, there is no need for the 'deprecated'
: solutions to be as general, or for them to stand for the next few
: decades of programming.
: After all, programmers have lived without a One Definition
: Rule (the ARM rule is meaningless), and Standardising an overly
: restrictive rule is less of a problem if an alternate mechanism
: exists.
: The simple fact is that 'backwards compatibility' has
: already been thrown out in several cases --
: a) "new" throws exceptions
: b) pointers aren't safe
: c) template specialisation is changed
: d) const and non-const pass by value made indistinguishable
: e) binding non-const references to temporaries banned
: f) cast-away-const banned (use mutable)
: and I see no reason why a suitable module system should not
: be added to the list. PROVIDED IT'S DONE NOW. When programmers
: _expect_ to have to upgrade to the Standard.
: There will be SERIOUS and WELL FOUNDED objections to not providing
: compatibility with a Standard -- either the existing ISO C Standard
: or the C++ Standard (when it eventuates).
: But the very existence of the committee is because existing
: practice is divergent. Standardising it is a PROMISE to
: break at least all but one divergent interpretation,
: and often breaking the lot is the best possible solution.
: --
: JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
: Maxtal Pty Ltd, CSERVE:10236.1703
: 6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
: NSW 2131, AUSTRALIA
I wholeheartedly agree. The process of standardization must allow for
experimentation. Some experiments will fail. If the C++ committee is
unwilling to step back from a failure and correct the mistakes while they
can, the standardization problem won't matter much, because there will be
a new committee, fostering a new language to correct the problems that were
not solved before standardization.
Those who argue that changes to the C++ language must be backward
compatible simply misread the packaging. The language is not yet a
standard; it is continuing to evolve. The popularity of a language does not
decide its completeness; acceptance by a standards body does.
Stephen Cipolli
Datascope Corp.
These opinions are mine alone.
The world cannot afford another language standardization.
Author: jfarley@cen.com (John Farley)
Date: 13 May 1994 15:31:37 GMT
dak@kaa.informatik.rwth-aachen.de (David Kastrup) writes:
> I want it to spend the time on making good, efficient code,
> and using that additional time for optimizations.
> ...
> The parser will swallow loads of time more than other parsers.
> ...
> And if we get to the point where we have still processing time to
> spare, then we cannot do many good optimizations because ...
> ...
> As soon as you do OOP programming in C++, the turnaround times
> get incredibly bad.
For a guy who doesn't believe in faster computers, you sure worry a
lot about performance and optimization! You miss my point completely.
None of these issues affect me. I have modern fast machines, the compilers
are plenty fast, and the turnaround times are extremely fast. The hardware
guys have already optimized the machines to the point where software
optimization becomes a second or third order effect.
I don't write compilers for a living. I build large complex systems using
C++. The language does an excellent job of allowing me to concentrate on
solving application domain problems with elegant implementations. Frankly,
my biggest problem with C++ isn't a language feature... it's the lack of
standardization. How many times have you seen this in comp.lang.c++:
gcc does this:
...
bcc does this:
...
cfront does this:
...
which is right?
As a full-time user of C++, I would much rather have the compiler writers
of the world concentrate on getting the CURRENT feature set to work
correctly than spend their time changing the compilers to meet new
language features that solve yesterday's problems! I especially don't want
> I once tried cutting down a tree with a knife...
>
> I used all the tricks like jiggling the knife as fast as possible, ...
>
> I did whatever it took to avoid the massive regrowth problem...
And the lessons that you should have learned from your tree-cutting episode
are:
1) Use a LARGE sharp metal blade, not a small sharp metal blade
2) If you insist on using a knife, at least get an electric one that
jiggles 10 times faster.
> ... C++ is inefficient as hell in its language design:
> far too complicated and obfuscated for what it achieves.
Perhaps you should be spending your time on comp.std.newOOL
--
jfarley@cen.com -> johnFarley
Author: sjc@netcom.com (Steven Correll)
Date: Fri, 13 May 1994 18:17:46 GMT
>: >>> Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
>: >>> : The real problem (IMHO) is requiring C++ programs which use
>: >>> : a particular class to #include the complete (including privates)
>: >>> : declaration of the class.
>Steven Correll (sjc@netcom.com) wrote:
>: This is expensive not only during compilation, but also during
>: linking and debugging...In a language which
>: separates interface from implementation and which does not rely on simple
>: textual inclusion via a preprocessor, the problem goes away: the compiler
>: generates symbols once for an exporter and need not regenerate them for
>: each importer.
In article <CppsMB.A7M@utnetw.utoledo.edu>,
Tim Arheit AKA Fool Boy <tarheit@jupiter.cse.utoledo.edu> wrote:
>This does not have to be the case. SAS C (for example) lets you compile
>all the headers you want into a GST (global symbol table), so it eliminates
>the necessity of reading all the header files and parsing them again...
Borland ("precompiled headers"), and Microsoft offer similar features. The
limitation of these schemes is that because the language relies on textual
inclusion instead of explicitly separating interface from implementation, they
fail if the user is not careful to avoid conditional compilation, code
fragments within included files, undefining and redefining symbols, varying
the order of inclusion, etc. (Translation: you have to turn the feature off
to make the compiler conform to the standard.) My experience is that
precompiled headers are very beneficial, but fragile: if you slip up,
mysterious things happen to you. And (perhaps wisely) neither Borland nor
Microsoft relies on this feature to reduce the link-time expense.
--
Steven Correll == PO Box 66625, Scotts Valley, CA 95067 == sjc@netcom.com
Author: guerin@IRO.UMontreal.CA (Frederic Guerin)
Date: Fri, 13 May 1994 14:46:39 GMT
John Farley (jfarley@cen.com) wrote:
: Anyway, the machines get faster every day. I don't think it matters what
: platform you are using today, in a couple of years the compile time will
: be somewhat irrelevant to you.
It depends on the relative growth of program size vs CPU power.
If I were you, I would not bet that CPU power will encompass program size
so spectacularly. You know, we are all humans after all, and when we get
more power, we always want many many ... many more things to be done.
C'est la vie! :-)
: --
: jfarley@cen.com -> johnFarley
Frederic G.
Author: bs@alice.att.com (Bjarne Stroustrup)
Date: 14 May 94 02:22:29 GMT
If I remember rightly, dinosaurs were the dominant class of animals
for something like 100 million years. Even C++'s best friends wouldn't
want C++ to last that long.
However, C++ isn't going to disappear tomorrow either.
Reading this and other threads gives me the impression that some of the
posters could use some basic information about C++ and the aims of C++.
I recommend:
Bjarne Stroustrup:
The Design and Evolution of C++
Addison Wesley, ISBN 0-201-54330-3.
March 1994.
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Sat, 7 May 1994 21:00:10 GMT
In article <2qf3h8INN9r8@watt.cs.unc.edu> leech@cs.unc.edu (Jon Leech) writes:
>In article <nagleCpEF6K.859@netcom.com>, John Nagle <nagle@netcom.com> wrote:
>> If you're willing to front-end a C compiler to get a C++ compiler,
>>what's so bad about front-ending the linker to get a C++ linker?
>
> Different problem. Cfront generates code that works with any
>sufficiently robust C compiler, with only size/alignment properties of the
>target architecture as parameters. How could you possibly generate a linker
>frontend that's portable?
Easily. You just generate "C" code in your "object" files.
You link them textually to get a single text file, then compile
that with a C compiler.
More practically, you generate both .o files and .description
files. The .description files are used by the linker "front end".
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: pete@genghis.interbase.borland.com (Pete Becker)
Date: Mon, 9 May 1994 15:21:58 GMT
In article <CpDC9E.15G@ucc.su.oz.au>,
John Max Skaller <maxtal@physics.su.OZ.AU> wrote:
>In article <2qatrp$9g9@scus1.ctstateu.edu> s3900120@scus1.ctstateu.edu (Student 20) writes:
>
>>include file;
>>
>
> With modules, we don't need a One Definition Rule.
>We don't need inline functions. We don't need to write separate
>interfaces. We don't have macros zapping the meaning
>of header files. We get an order of magnitude speed
>increase for compilation.
>
> What else do you want?
>
A cure for the common cold, batteries that last forever, and world
peace. Modules are no more likely to solve these problems than they are to
solve the claimed problems of C++. Unless, of course, you're willing to throw
out all currently existing C++ code. If not, then the language definition must
deal with defining what the language as currently used means.
-- Pete
Author: nagle@netcom.com (John Nagle)
Date: Mon, 9 May 1994 16:04:06 GMT
maxtal@physics.su.OZ.AU (John Max Skaller) writes:
>In article <2qf3h8INN9r8@watt.cs.unc.edu> leech@cs.unc.edu (Jon Leech) writes:
>>In article <nagleCpEF6K.859@netcom.com>, John Nagle <nagle@netcom.com> wrote:
>>> If you're willing to front-end a C compiler to get a C++ compiler,
>>>what's so bad about front-ending the linker to get a C++ linker?
>>
>> Different problem. Cfront generates code that works with any
>>sufficiently robust C compiler, with only size/alignment properties of the
>>target architecture as parameters. How could you possibly generate a linker
>>frontend that's portable?
> Easily. You just generate "C" code in your "object" files.
>You link them textually to get a single text file, then compile
>that with a C compiler.
That would work, but linking would be slow.
Worth considering is an approach to linking where the compilers
generate some intermediate form (maybe like Windows P-code) and the
actual code generation occurs at link time, when full information is
available. With caching of generated code by the linker, this could
be efficient, and with a project-file system like the Think products
for the Mac, it could be invisible to the user.
With an approach like this, you could have fully automatic template
instantiation with duplicate elimination, based on the code actually
generated. This would fold multiple instances of the same template which
turn out to generate the same code for different types. You could also
have cross-module inlining, automatic ordering of static constructors,
and similar global optimizations.
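For example, a sketch of the folding opportunity:

template <class T>
struct Stack {
    T data[100];
    int count;
    void push(T v) { data[count++] = v; }
};
// On a machine where int and long have the same size and representation,
// Stack<int>::push and Stack<long>::push generate identical code; a
// link-time code generator could detect this and keep a single copy.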
John Nagle
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Sat, 7 May 1994 15:30:09 GMT
In article <2qdljrINNgj1@early-bird.think.com> chase@Think.COM (David Chase) writes:
>
>So, unfortunately, I think we're stuck with the complete
>declaration of the class.
Probably, but don't be too sure.
There's no real technical reason I can think of that
we can't define:
include "unit";
to include the external interface of "unit". Current C++ rules
REQUIRE private members to be visible. There's an argument
both for and against that. Bjarne made the decision a long
time ago, and it's part of a fundamental principle of access control
in C++.
It's NOT necessary that the private members be visible
with this mechanism; the size can be recorded and visible without
knowing the contents. It's just not according to the existing
rules.
Those rules are one of the FEW places where a sound
principle is used to derive the language rules. It's not the
only possibility, but we shouldn't mess with it unless we're
really sure an alternative is superior.
I'm not at all sure Bjarne made the right decision.
Which means I'm not at all sure he made the wrong one either.
So I'm prepared to live with his decision. I'm glad he
made it, one way or the other, because at present, I couldn't.
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Sat, 7 May 1994 15:43:36 GMT
In article <nagleCpEF6K.859@netcom.com> nagle@netcom.com (John Nagle) writes:
>
> Acceptance of dumb linkers is warping the whole language
>out of shape.
'Fraid I agree. The problem is that some things like
templates already require 'pre-archaic-linker-processing' to be done.
And once we have that, there's a whole family of restrictions in
C++ that ought to be thrown out -- because compatibility with
archaic linker technology is the one (no longer sustainable) argument
for these 'features'.
What's more, it's these 'features' that are delaying the Standard
and causing all manner of problems for the committee. Issues such
as linkage, inlines, the One Definition Rule, template instantiation,
specialisation, in-class initialisation .. all sorts of things
are the way they are for a reason that is no longer reasonable.
One half of the language is no longer in sync with the
other. It's as if it were being redesigned by a schizophrenic
with split-brain syndrome.
> If you're willing to front-end a C compiler to get a C++ compiler,
>what's so bad about front-ending the linker to get a C++ linker?
The question is, is there any alternative AT ALL?
(Short of banning templates?)
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: martelli@cadlab.sublink.org (Alex Martelli)
Date: Mon, 09 May 1994 16:08:05 GMT
nagle@netcom.com (John Nagle) writes:
...
:>>what's so bad about front-ending the linker to get a C++ linker?
:> Different problem. Cfront generates code that works with any
:>sufficiently robust C compiler, with only size/alignment properties of the
:>target architecture as parameters. How could you possibly generate a linker
:>frontend that's portable?
:
: It's easier today. There are only three formats that really matter;
:Microsoft/Intel, Apple, and UNIX. If you get them covered, you have
:well over 90% of computing.
"UNIX", unfortunately, is NOT one "linker format". COFF, ELF, HP's
proprietary stuff, and so on ad nauseam.
Alex
--
Email: martelli@cadlab.systemy.org Phone: ++39 (51) 6130360
CAD.LAB s.p.a., v. Ronzani 7/29, Casalecchio, Italia Fax: ++39 (51) 6130294
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Sun, 8 May 1994 17:10:14 GMT
In article <JASON.94May7032003@deneb.cygnus.com> jason@cygnus.com (Jason Merrill) writes:
>>>>>> John Nagle <nagle@netcom.com> writes:
>
>> If you're willing to front-end a C compiler to get a C++ compiler,
>> what's so bad about front-ending the linker to get a C++ linker?
>
>Most C++ compilers *do* have special linker support; g++ has collect2 or
>GNU ld, Cfront has its collect2-like munger, xlC and Borland C++ have
>linkers that merge common pieces of the text segment. However, it seems to
>me that late binding of object sizes is outside of the realm of linker
>support,
Why? It's not that hard to do. Why shouldn't the size simply
be an external reference? Some linkers can add multiple externals
together, others can't. It is possible.
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Sun, 8 May 1994 17:37:31 GMT
d88-jwa@dront.nada.kth.se (Jon Wätte) writes:
>fjh@munta.cs.mu.OZ.AU (Fergus Henderson) writes:
>
>>[discussion of adding a module system to C++]
>>> What else do you want?
>
>>Backward compatibility.
>
>I don't see how backwards compatibility comes into it; the usage
>of a module system would be orthogonal to the current headers-and-
>sources inclusion strategy.
>
>Old files do #include, and it stil works. Your module could still do it
>when you build it. New files do include modules, and compile much faster
>with less unneccesary re-compiles.
Fergus's point is valid. One of the reasons for adopting
a module system is that the current #include mechanism DOESN'T WORK.
>
>The problems we have with the current C++ definition are of the type
>"things you have to live with" and will probably NOT be solved for
>non-module compilations.
The current problems with the C++ "definition" are that
programmers THINK they know what the definition is, but if they
read the Working Paper or ARM carefully, they'd find they were
relying on intuition, logic, experience and "suck it and see".
Formalising the rules is hard enough -- agreeing
on the formalised rules is another thing -- especially
when 'Standardising' divergent interpretations is a guarantee
_someone's_ compiler will be broken.
[In fact, most vendors are very responsible about this.
Some are TOO responsible, sticking to the specifications when they're
clearly inadequate.]
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Sun, 8 May 1994 17:26:40 GMT
In article <9412814.10061@mulga.cs.mu.OZ.AU> fjh@munta.cs.mu.OZ.AU (Fergus Henderson) writes:
>maxtal@physics.su.OZ.AU (John Max Skaller) writes:
>
>[discussion of adding a module system to C++]
>
>> What else do you want?
>
>Backward compatibility.
With what? BC3.1? BC4.0? MS? GNU? Cfront? HighC? Watcom?
IBM? HP? Edison Design? Symantec? The ARM??
>
>> If we do it, we can solve about 6 major problems
>>in C++ which result from NOT having a module system. Overhauling
>>the language definition to solve those problems is certain
>>to delay Standardisation.
>
>But you will still have to solve those problems anyway, unless
>you abandon backward compatibility.
However, there is no need for the 'deprecated'
solutions to be as general, or for them to stand for the next few
decades of programming.
After all, programmers have lived without a One Definition
Rule (the ARM rule is meaningless), and Standardising an overly
restrictive rule is less of a problem if an alternate mechanism
exists.
The simple fact is that 'backwards compatibility' has
already been thrown out in several cases --
a) "new" throws exceptions
b) pointers aren't safe
c) template specialisation is changed
d) const and non-const pass by value made indistinguishable
e) binding non-const references to temporaries banned
f) cast-away-const banned (use mutable)
and I see no reason why a suitable module system should not
be added to the list. PROVIDED IT'S DONE NOW. When programmers
_expect_ to have to upgrade to the Standard.
There will be SERIOUS and WELL FOUNDED objections to not providing
compatibility with a Standard -- either the existing ISO C Standard
or the C++ Standard (when it eventuates).
But the very existence of the committee is because existing
practice is divergent. Standardising it is a PROMISE to
break at least all but one divergent interpretation,
and often breaking the lot is the best possible solution.
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: jfarley@cen.com (John Farley)
Date: 11 May 1994 13:55:31 GMT
> Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
> : The real problem (IMHO) is requiring C++ programs which use
> : a particular class to #include the complete (including privates)
> : declaration of the class.
IMHO, the REAL problem is that C++ compiles are slow. A couple of years
ago I worried about this a *LOT*. I used all of the tricks like pointers
to a private (hidden) class that contained the implementation data. I
was extremely reluctant to make a change to a class interface, whether
public or private. I did whatever it took to avoid the massive recompile
problem. One day as I was complaining bitterly about the LANGUAGE, a
friend pointed out that "There's nothing wrong with the language, if your
compiles were 10 times faster you wouldn't worry about this at all!".
These days I do my development on an HP-735 computer (no product endorsement
intended, there are other fast machines). The compiles are at least 10 times
faster (1 source file/sec on average, YMMV), and I don't worry about the
LANGUAGE problem at all. I spend my time getting the classes right, not
worrying about the effect on compile time.
My advice is, instead of trying to change the language, think seriously about
changing your development environment. If you are working in the UNIX world,
there are now plenty of FAST machines, and FAST compilers. In the PC world,
Pentium/PCI, PowerPC, NT on RISC platforms may do the trick.
Anyway, the machines get faster every day. I don't think it matters what
platform you are using today, in a couple of years the compile time will
be somewhat irrelevant to you. Getting a change in the language spec approved,
and getting it into the compilers and working the bugs out is sure to take
MUCH longer.
--
jfarley@cen.com -> johnFarley
Author: chase@Think.COM (David Chase)
Date: 11 May 1994 18:42:36 GMT
|> maxtal@physics.su.OZ.AU (John Max Skaller) writes:
|> [discussion of adding a module system to C++]
|> > What else do you want?
fjh@munta.cs.mu.OZ.AU (Fergus Henderson) writes:
|> Backward compatibility.
Why? We never had it before. We never, ever, had a trouble-free
transition from one release of C++ to the next, and the language
badly needs real live interface files.
David
Author: brett@wv.mentorg.com (Brett Stutz @ PCB x5574)
Date: Wed, 11 May 1994 21:29:27 GMT
In article <2qqo4j$bti@paperboy.gsfc.nasa.gov>, jfarley@cen.com (John Farley) writes:
|> > Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
|> > : The real problem (IMHO) is requiring C++ programs which use
|> > : a particular class to #include the complete (including privates)
|> > : declaration of the class.
|>
|> IMHO, the REAL problem is that C++ compiles are slow. A couple of years
Well, recompilation may be OK if you have a homogeneous (probably fairly small)
organization. But it's a pain to build class libraries in C++ that don't
allow the clients to see private internals of the classes. Furthermore, it's
really tough to build a class library such that I can modify the (private) aspects
of the implementation without requiring clients to recompile. If I'm shipping an
application using multiple libraries on different rev cycles, I have to recompile
and ship a new release every time one of the libraries is changed.
--
----------------------------------------
Brett C. Stutz, Object Wrangler
Mentor Graphics Corporation
PCB Division
1001 Ridder Park Drive
San Jose CA 95131
(408)451-5574
brett_stutz@mentorg.com
Opinions expressed are my own, not MGC's
----------------------------------------
Author: svv@phoenix.dev.macsch.com (Suresh Vaidyanathan)
Date: Tue, 3 May 1994 23:34:26 GMT Raw View
In article <nagleCp5IsM.6EE@netcom.com> nagle@netcom.com (John Nagle) writes:
>d88-jwa@dront.nada.kth.se (Jon Wätte) writes:
>>In C++, you have to let class clients know the PRIVATE interface
>>of the class, and changing the private interface (i e implementation)
>>may trigger a LARGE re-compile.
>
>>I used to think that was no big deal, but I'm starting to grow
>>really tired of it.
>
Yeah.
Why not have a "definition" and "declaration" distinction for classes too?
The declaration specifies the interface and advertizes all public attributes
and methods, and the definition specifies the implementation.
.
.
.
Ok grumpy, you may "bark" now :-) !
Author: pete@genghis.interbase.borland.com (Pete Becker)
Date: Tue, 3 May 1994 23:53:19 GMT Raw View
In article <Cp92tF.5Mu@draco.macsch.com>,
>
>Why not have a "definition" and "declaration" distinction for classes too?
>The declaration specifies the interface and advertizes all public attributes
>and methods, and the definition specifies the implementation.
Where would you put the definition of an inline function? Especially
if the inline function calls a private member function?
-- Pete
Author: jason@cygnus.com (Jason Merrill)
Date: Wed, 4 May 1994 02:06:51 GMT Raw View
>>>>> Suresh Vaidyanathan <svv@phoenix.dev.macsch.com> writes:
> Why not have a "definition" and "declaration" distinction for classes too?
See section 9.1c (Interfaces) of the ARM.
Jason
Author: ej070@cleveland.Freenet.Edu (Ron Rossbach)
Date: 4 May 1994 12:52:25 GMT Raw View
The real problem (IMHO) is requiring C++ programs which use
a particular class to #include the complete (including privates)
declaration of the class. As mentioned before, other languages
like Ada get this right; a user should only need to #include the
public interface.
I realize this creates problems for object sizing, but they are
fixable, I would think.
Author: mbk@inls1.ucsd.edu (Matt Kennel)
Date: 5 May 1994 01:06:15 GMT Raw View
Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
: The real problem (IMHO) is requiring C++ programs which use
: a particular class to #include the complete (including privates)
: declaration of the class. As mentioned before, other languages
: like Ada get this right; a user should only need to #include the
: public interface.
: I realize this creates problems for object sizing, but they are
: fixable, I would think.
I personally think that C++ ought to make #include and the
use of "header files" completely optional, and obsolete.
--
-Matt Kennel mbk@inls1.ucsd.edu
-Institute for Nonlinear Science, University of California, San Diego
-*** AD: Archive for nonlinear dynamics papers & programs: FTP to
-*** lyapunov.ucsd.edu, username "anonymous".
Author: spitzak@mizar.usc.edu (William Spitzak)
Date: 4 May 1994 20:57:37 -0700 Raw View
ej070@cleveland.Freenet.Edu (Ron Rossbach) writes:
>The real problem (IMHO) is requiring C++ programs which use
>a particular class to #include the complete (including privates)
>declaration of the class. As mentioned before, other languages
>like Ada get this right; a user should only need to #include the
>public interface.
>I realize this creates problems for object sizing, but they are
>fixable, I would think.
First, it would be really helpful to be able to define "extra"
non-virtual methods in the implementation. Basically, if X is
a class you can at any time declare a new function with the name
X::foo, whether or not it is in the class definition. The resulting
function is exactly the same as if you declared it in the private
part of the class. You cannot redefine a virtual function this
way; that results in an error. I'm not sure what putting "static"
in front of the function should do.
I think this is entirely safe, because the resulting function is
private and thus can only be called by declared methods of the
class. Yes, any source file can declare one, but without changing
the class definition they can't call it!
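(For comparison: the closest current C++ comes to this is a file-static
helper in the implementation file. A minimal sketch, with names of my
own choosing; note that unlike the proposed X::foo, the helper cannot
touch X's private members unless they are passed to it, which is
presumably what motivates the extension.)

    // x.h -- the class definition clients see
    class X {
    public:
        void update();
    private:
        int count;
    };

    // x.cpp -- helper hidden from clients; adding or changing it
    // forces no recompilation of code that includes x.h
    #include "x.h"

    static int scaled(int n) { return n * 2; }   // internal linkage

    void X::update() { count = scaled(count); }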
More elaborate enhancement:
Allow the token "..." to appear in class definitions so you can give
them in multiple parts. "..." must appear immediately after the open
'{' or before the closing '}' or in both places.
A closing ... means the definition is incomplete. Unless more of the
definition is found, it is illegal to construct one of these objects
(meaning no automatic or static variables and no "new") and you cannot
take the sizeof the object. However you can declare pointers and
references to these objects and use and declare functions that take
and return pointers and references. Inline functions declared here
work fine, too. Inline constructors are not allowed.
An opening "..." means this is a continuation of a previous definition
ended with "...". This token's purpose is just to allow better
error checking. The resulting class is exactly the same as though
the previous and new definitions were concatenated together. Notice
that the continuation itself can end with "...".
The already supported C++ syntax of "class foo;" indicates an empty
initial incomplete definition. The syntax "class foo {...};" is a
terminating one to indicate that nothing, in fact, is added to the
class. The silly-looking "class foo {... ...};" is how you make an
empty intermediate definition.
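(The restrictions attached to a closing "..." are essentially those
C++ already imposes on incomplete types; a minimal sketch of the rules
as they stand today:)

    class foo;                  // incomplete type: only the name is known

    foo* make_foo();            // OK: pointers, references, and functions
    void consume(foo&);         //     taking or returning them are fine

    // foo f;                   // error: cannot create an object
    // int n = sizeof(foo);     // error: size is unknown

    class foo { int x; };      // completing the definition...
    int n = sizeof(foo);       // ...makes both legal again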
--
__ ______ _______ ______________________________
|_)o|| (_ ._ o|_ _ _.| spitzak@mizar.usc.edu
|_)||| __)|_)||_ /_(_||<
----------|------------------------------------------
Author: kevlin@wslint.demon.co.uk (Kevlin Henney)
Date: Thu, 5 May 1994 10:08:19 +0000 Raw View
In article <2q85qa$638@usenet.INS.CWRU.Edu>
ej070@cleveland.Freenet.Edu "Ron Rossbach" writes:
>The real problem (IMHO) is requiring C++ programs which use
>a particular class to #include the complete (including privates)
>declaration of the class. As mentioned before, other languages
>like Ada get this right; a user should only need to #include the
>public interface.
>
>I realize this creates problems for object sizing, but they are
>fixable, I would think.
I would check up on your Ada if I were you.
The problems are only fixable with an extra level of indirection:
Cheshire Cat classes. C++ is value based, so statically determined
object size is _very_ important. See the ARM for reasons behind
having to include private stuff with the public stuff.
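(For anyone who hasn't met the term: a Cheshire Cat class leaves only
the grin -- a single pointer -- visible in the header. A minimal
sketch, names mine:)

    // widget.h -- sizeof(Widget) never changes when the body does
    class WidgetBody;               // defined only in widget.cpp
    class Widget {
    public:
        Widget();
        ~Widget();
        void draw();
    private:
        WidgetBody* body;           // the grin: the one visible member
    };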
--
Kevlin Henney
In DOS, I know not why I am so sad,
It wearies me; you say it wearies you;
But how I caught it, found it, or came by it,
What stuff 'tis made of, whereof it is born,
I am to learn;
The Merchant of Venice (I.i)
Author: stt@spock.camb.inmet.com (Tucker Taft)
Date: Thu, 5 May 1994 14:01:08 GMT Raw View
In article <768132499snz@wslint.demon.co.uk>,
Kevlin Henney <Kevlin@wslint.demon.co.uk> wrote:
>In article <2q85qa$638@usenet.INS.CWRU.Edu>
> ej070@cleveland.Freenet.Edu "Ron Rossbach" writes:
>>The real problem (IMHO) is requiring C++ programs which use
>>a particular class to #include the complete (including privates)
>>declaration of the class. As mentioned before, other languages
>>like Ada get this right; a user should only need to #include the
>>public interface.
>>
>>I realize this creates problems for object sizing, but they are
>>fixable, I would think.
>
>I would check up on your Ada if I were you.
Ada does "get it right" w.r.t. functions. One may define additional
functions (aka "subprograms") in the body of a package, and there is
no recompilation required by any clients. (There are various
ways to accomplish the same thing in C++ somewhat less directly,
using friends, etc.) In Ada 9X, one may also define "private child units"
as well, which can be separately compiled, but which are only visible
to the body and descendants of the package.
As far as private data members, it is possible to declare variables
in Ada 9X whose size is not known at compile-time, either by
using discriminants, perhaps to control the length of an array
inside the record, or using a "class-wide" type initialized by
copying from some existing object.
E.g.:

Using discriminants:

    type Text(Max : Natural) is record
        Data : String(1..Max);
    end record;

    N : Natural := Get_Some_Value(...);
    X : Text(Max => N);  -- Size not known at compile-time

Using variables of a class-wide type:

    -- An abstract type Set used as the root of a class of sets:
    type Set is abstract tagged private;

    function Empty return Set is abstract;
    procedure Take(Elem : out Element; From : in out Set) is abstract;
    procedure Insert(Elem : in Element; Into : in out Set) is abstract;

    -- a class-wide conversion routine; can convert between
    -- two different representations of a set
    procedure Convert(Source : in Set'Class; Target : out Set'Class) is
        Copy_Of_Source : Set'Class := Source;  -- Size not known at compile-time
        Elem : Element;
    begin
        Target := Empty;
        while Copy_Of_Source /= Empty loop
            Take(Elem, From => Copy_Of_Source);
            Insert(Elem, Into => Target);
        end loop;
    end Convert;
The local variable "Copy_Of_Source" is a copy of the input parameter Source,
based on the "run-time" type of Source, with no "truncation."
Such a variable whose size (and "run-time" type) is determined by its
initial value is useful as a temporary, as illustrated above.
>The problems are only fixable with an extra level of indirection:
True, but the compiler can provide the level of indirection itself,
and still follow stack discipline to provide automatic reclamation
of the storage for the local variable.
>... C++ is value based, so statically determined
>object size is _very_ important.
Ada 9X is also value based, but the programmer is allowed to
declare variables (such as arrays or class-wide variables) whose
size is not known at compile-time. A separate "secondary" mark/release
stack or the equivalent of "alloca()" can be used to implement such
variables quite efficiently.
> ...
>Kevlin Henney
S. Tucker Taft
Intermetrics, Inc.
Cambridge, MA 02138
Author: nagle@netcom.com (John Nagle)
Date: Thu, 5 May 1994 16:53:25 GMT Raw View
mbk@inls1.ucsd.edu (Matt Kennel) writes:
>Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
>: The real problem (IMHO) is requiring C++ programs which use
>: a particular class to #include the complete (including privates)
>: declaration of the class. As mentioned before, other languages
>: like Ada get this right; a user should only need to #include the
>: public interface.
>: I realize this creates problems for object sizing, but they are
>: fixable, I would think.
>I personally think that C++ ought to make #include and the
>use of "header files" completely optional, and obsolete.
This is coming with CORBA.
John Nagle
Author: s3900120@scus1.ctstateu.edu (Student 20)
Date: Thu, 5 May 1994 13:55:05 GMT Raw View
In article <2q9gq7$3nm@network.ucsd.edu>,
Matt Kennel <mbk@inls1.ucsd.edu> wrote:
>Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
>
>: The real problem (IMHO) is requiring C++ programs which use
>: a particular class to #include the complete (including privates)
>: declaration of the class. As mentioned before, other languages
>: like Ada get this right; a user should only need to #include the
>: public interface.
>
>: I realize this creates problems for object sizing, but they are
>: fixable, I would think.
>
>I personally think that C++ ought to make #include and the
>use of "header files" completely optional, and obsolete.
>
I can't see how, short of going to global (unchecked) scoping, and that
cure would be worse than the disease.
I've been recently looking at a new C/C++ derived language called
Parasol, and despite certain difficulties (not the least of which being the
limited documentation and the currently-being-solved problem of being limited
to a single platform and OS), it looks quite good. The language (which was
described in _DDJ_ Nov 1993) solves the problem in a manner similar to Eiffel
and Ada, except that there is no separate declaration header. That is, external
declarations are implicit in public declarations; if a method or instance
is declared public in a public class, then it is the interface both inside and
outside the module. There is no preprocessor; the statement
include file;
does not (despite the syntax similarity) physically include the 'file', but
rather is a statement to the linker to find the relevant externals, as in
Ada. One of the advantages is that it simplifies the declaration syntax
enormously (the change to a more Pascal-like syntax is in fact the largest
divergence from the C family of languages, and it cleans things up substantially,
IMHO). Going to this sort of thing in C++ would require a major overhaul,
however. I wish this language were more available, since I think that (despite
the altered syntax and intentional elimination of multiple overloading within
a single class (a good decision) and operator declarations) Parasol has the
simplicity and rigor to be a worthy successor to C and C++.
Trouble rather the|Schol-R-LEA;2 ____ | Finagle is Ghod, and
tiger in his lair |Test Subject \bi/ | Murphy is His Prophet!
than the scholar |R&D \/ |
Author: tiemens@dutiag.twi.tudelft.nl (O. Tiemens)
Date: Fri, 6 May 1994 08:21:12 GMT Raw View
In article <nagleCpC9L2.1IE@netcom.com>, John Nagle <nagle@netcom.com> wrote:
>mbk@inls1.ucsd.edu (Matt Kennel) writes:
>>Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
>>I personally think that C++ ought to make #include and the
>>use of "header files" completely optional, and obsolete.
>
> This is coming with CORBA.
>
> John Nagle
Could you elaborate on that?
Regards,
Oscar
Author: je@unix.brighton.ac.uk (John English)
Date: Thu, 5 May 1994 14:54:15 GMT Raw View
ej070@cleveland.Freenet.Edu (Ron Rossbach) writes:
: The real problem (IMHO) is requiring C++ programs which use
: a particular class to #include the complete (including privates)
: declaration of the class. As mentioned before, other languages
: like Ada get this right; a user should only need to #include the
: public interface.
Errm, no. You still have to put private stuff in Ada package specs:
package X is
    type Y is private;
    ...
private
    type Y is record ... end record;
end X;
: I realize this creates problems for object sizing, but they are
: fixable, I would think.
That's why Ada requires it in the package spec. Anyway, what's wrong with
class X {
    friend class Xprivate;
public:
    ...
private:
    Xprivate* privateData;
};
and then access your private data via a pointer? The class Xprivate can
then be declared and defined in the .CPP file so that you can change it
without recompiling everything. Okay, it's an extra indirection but IMHO
that's a small price to pay for the extra flexibility. A more serious
difficulty is that the Xprivate object has to be created using new in the
X constructor, which can cause problems if your compiler doesn't support
exceptions.
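(A sketch of what that .CPP file might look like, assuming the public
"..." above declares a constructor and destructor:)

    // x.cpp -- Xprivate can change freely without touching x.h
    #include "x.h"

    class Xprivate {
    public:
        int cachedValue;            // whatever the implementation needs
    };

    X::X() : privateData(new Xprivate) { }  // the allocation in question
    X::~X() { delete privateData; }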
--
-------------------------------------------------------------------------------
John English | Thoughts for the day:
Dept. of Computing | - People who live in windowed environments
University of Brighton | shouldn't cast pointers
E-mail: je@unix.brighton.ac.uk | - In C++ only your friends can access your
Fax: 0273 642405 | private parts
-------------------------------------------------------------------------------
Author: kevlin@wslint.demon.co.uk (Kevlin Henney)
Date: Fri, 6 May 1994 10:17:52 +0000 Raw View
In article <CpC1Lx.1DD@inmet.camb.inmet.com>
stt@spock.camb.inmet.com "Tucker Taft" writes:
>In article <768132499snz@wslint.demon.co.uk>,
>Kevlin Henney <Kevlin@wslint.demon.co.uk> wrote:
>
>>In article <2q85qa$638@usenet.INS.CWRU.Edu>
>> ej070@cleveland.Freenet.Edu "Ron Rossbach" writes:
>
>>>The real problem (IMHO) is requiring C++ programs which use
>>>a particular class to #include the complete (including privates)
>>>declaration of the class. As mentioned before, other languages
>>>like Ada get this right; a user should only need to #include the
>>>public interface.
>>>
>>>I realize this creates problems for object sizing, but they are
>>>fixable, I would think.
>>
>>I would check up on your Ada if I were you.
>
>Ada does "get it right" w.r.t. functions. One may define additional
>functions (aka "subprograms") in the body of a package, and there is
This is nothing new and has nothing whatsoever to do with my point
about data type sizing. Most languages with a concept of translation
unit allow hidden definitions - even C with static.
>no recompilation required by any clients. (There are various
>ways to accomplish the same thing in C++ somewhat less directly,
>using friends, etc.) In Ada 9X, one may also define "private child units"
>as well, which can be separately compiled, but which are only visible
>to the body and descendants of the package.
>
>As far as private data members, it is possible to declare variables
>in Ada 9X whose size is not known at compile-time, either by
>using discriminants, perhaps to control the length of an array
>inside the record, or using a "class-wide" type initialized by
>copying from some existing object.
[example code deleted]
I was referring to _real_ Ada ;-)
>>The problems are only fixable with an extra level of indirection:
>
>True, but the compiler can provide the level of indirection itself,
>and still follow stack discipline to provide automatic reclamation
>of the storage for the local variable.
>
>>... C++ is value based, so statically determined
>>object size is _very_ important.
The following, however, is relevant:
>Ada 9X is also value based, but the programmer is allowed to
>declare variables (such as arrays or class-wide variables) whose
>size is not known at compile-time. A separate "secondary" mark/release
>stack or the equivalent of "alloca()" can be used to implement such
>variables quite efficiently.
I wasn't aware of this in Ada 9X. This seems like a reasonable solution
for a number of languages, but not C++: the definition of and
constraints on the sizeof operator are the key here. Yes, C++ is
weighed down by its C heritage, but that is also one of its strengths.
Take that away and it is no longer C++ and we are then discussing a
different language with very different design criteria.
>S. Tucker Taft
>Intermetrics, Inc.
>Cambridge, MA 02138
>
--
Kevlin Henney
In DOS, I know not why I am so sad,
It wearies me; you say it wearies you;
But how I caught it, found it, or came by it,
What stuff 'tis made of, whereof it is born,
I am to learn;
The Merchant of Venice (I.i)
Author: chase@Think.COM (David Chase)
Date: 6 May 1994 14:52:43 GMT Raw View
Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
: The real problem (IMHO) is requiring C++ programs which use
: a particular class to #include the complete (including privates)
: declaration of the class.
Matt Kennel mbk@inls1.ucsd.edu wrote:
> I personally think that C++ ought to make #include and the
> use of "header files" completely optional, and obsolete.
I agree. One of the major difficulties with include files is that
they are just textually included and subject to macro substitution
themselves (and, in a common idiom, included multiple times under
different macro settings). Proper, separate interface files let
you either pre-parse them into a more efficient form or write a
"compiler-server" that runs in background with all the interfaces
parsed in-core. IN THEORY, you could try something like this for
C++, where you might index the different expansions of the include
file by the settings of macros relevant to the preprocessor but it
requires more work, and checking the macro settings takes time as
well. If you can avoid the textual substitution troubles, the task
is much more straightforward (Mick Jordan wrote a compiler-server
for Modula-3 years ago, taking almost no time to complete the job,
and it worked wonderfully well).
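(A contrived two-file sketch of the difficulty: the same header
legitimately parses two different ways, so one cached form cannot
serve both includers:)

    /* speed.h -- its meaning depends on the translation unit */
    #ifdef USE_DOUBLE
    typedef double speed_t;
    #else
    typedef float speed_t;
    #endif

    /* a.cpp: here speed_t is float */
    #include "speed.h"

    /* b.cpp: here speed_t is double -- a precompiled parse of
       speed.h made for a.cpp would be wrong for this file */
    #define USE_DOUBLE
    #include "speed.h"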
As to hiding the sizes of objects, I agree that would be a good
idea, but I'm not sure that would be acceptable to most C++
programmers. It either slows down execution (extra lookups),
or requires extra linker support (late binding of object sizes),
or requires SELF-style on-the-fly compilation (really late, and
lazy, binding of object sizes). One reason people program in
C++ is because they think their code will run faster than some
of the other OO alternatives, so the first choice is not good.
Another reason people program in C++ is so that they will be
able to run on a wide range of standard-ish platforms and
interoperate with C, so the second choice is not good (linkers
seem to be a perennial obstacle, I do not know why). On-the-fly
compilation is cool, I think, but probably not well-enough proven
to be widely accepted.
So, unfortunately, I think we're stuck with the complete
declaration of the class.
David Chase, speaking for myself
Thinking Machines Corp.
Author: psu@cs.cmu.edu (Peter Su)
Date: Fri, 6 May 1994 16:42:18 GMT Raw View
In article <2qdljrINNgj1@early-bird.think.com>,
David Chase <chase@Think.COM> wrote:
>(linkers seem to be a perennial obstacle, I do not know why). On-the-fly
>compilation is cool, I think, but probably not well-enough proven
>to be widely accepted.
>
This is really odd. I mean, the UNIX linker is about the oldest, most
decrepit tool in the whole UNIX universe, having only recently been
educated enough to support dynamic loading on some platforms.
This is a really pitiful state of affairs. I mean, even *Macs*, that
bane of programming talent everywhere, have incremental linking.
Not to mention cool programming environments that do minimal linking
until actual stand-alone application generation time. I don't see
why UNIX can't do similar things.
'Course, I suppose, who wants to write another linker. Not me.
Pete
Author: chase@Think.COM (David Chase)
Date: 6 May 1994 17:35:28 GMT Raw View
In article <Cp93ow.nA@borland.com>, pete@genghis.interbase.borland.com (Pete Becker) writes:
|> In article <Cp92tF.5Mu@draco.macsch.com>,
|> >Why not have a "definition" and "declaration" distinction for classes too?
|> >The declaration specifies the interface and advertizes all public attributes
|> >and methods, and the definition specifies the implementation.
|> Where would you put the definition of an inline function? Especially
|> if the inline function calls a private member function?
Put the inline section in the definition section, but
advertise it as "inline" in the declaration section.
Inlining, or lack of it, should be semantically transparent,
right? It's a compile-time choice whether or not inlines
are really performed (could be on a per-package basis, if
you want) and the programming environment keeps track of
who actually inlined what from where, and whether they
need to be recompiled.
So, if inlining is on, then the compiler goes out and snarfs
it out of the relevant file (and clues in the make-like tool
that foo.o also depends upon bar.c). If the file is missing,
or the function is not found there, then issue a warning, don't
create the dependence, and just use the usual subroutine linkage.
There may be cascaded dependences created by the contents of the inline
function, since it may depend upon offsets that come from yet another file.
At any rate, you're no worse off than you are now, since the dependences
are the same. Done right, this is a major pain, since you need to keep
track of where compilation facts come from on a fine-grained basis, but a
"done-wrong" solution is probably no worse than what we live with today
(note that it can be worse, since definitions typically depend on more
stuff than declarations do).
So, when first developing, you probably don't inline, but when you
start to care about performance, you do. Both times, the interface/
declaration files are much less cluttered than they are now.
How's that?
David Chase, speaking for myself
Thinking Machines Corp.
Author: pdlogan@ccm.jf.intel.com (Patrick D. Logan)
Date: Fri, 6 May 1994 10:51:49 Raw View
In article <1994May5.145415.27579@unix.brighton.ac.uk> je@unix.brighton.ac.uk (John English) writes:
> Anyway, what's wrong with
> class X {
>     friend class Xprivate;
> public:
>     ...
> private:
>     Xprivate* privateData;
> };
>and then access your private data via a pointer?... A more serious
>difficulty is that the Xprivate object has to be created using new in the
>X constructor, which can cause problems if your compiler doesn't support
>exceptions.
Well it doesn't *have* to be new'd in the constructor in all cases. It could
be created on demand. It could come from a pre-allocated pool of space. Etc.
etc.
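(A sketch of the on-demand variant, with hypothetical names; it also
keeps the constructor itself from failing, since the allocation is
deferred until the body is first used:)

    // x.h
    class Xprivate;
    class X {
    public:
        X() : privateData(0) { }    // no allocation: cannot fail here
        ~X();
        void doWork();
    private:
        Xprivate* body();           // fetches, or first creates, the body
        Xprivate* privateData;
    };

    // x.cpp
    #include "x.h"

    class Xprivate { public: int state; };

    Xprivate* X::body() {
        if (privateData == 0)       // allocate lazily, on first use
            privateData = new Xprivate;
        return privateData;
    }

    X::~X() { delete privateData; } // Xprivate is complete here

    void X::doWork() { body()->state = 42; }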
Author: stt@spock.camb.inmet.com (Tucker Taft)
Date: Fri, 6 May 1994 19:55:44 GMT Raw View
In article <768219472snz@wslint.demon.co.uk>,
Kevlin Henney <Kevlin@wslint.demon.co.uk> wrote:
>>>... C++ is value based, so statically determined
>>>object size is _very_ important.
>
>The following, however, is relevant:
>
>>Ada 9X is also value based, but the programmer is allowed to
>>declare variables (such as arrays or class-wide variables) whose
>>size is not known at compile-time. A separate "secondary" mark/release
>>stack or the equivalent of "alloca()" can be used to implement such
>>variables quite efficiently.
>
>I wasn't aware of this in Ada 9X. This seems like a reasonable solution
>for a number of languages, but not C++: the definition of and
>constraints on the sizeof operator are the key here.
Actually, since ANSI C disallows the use of sizeof in #if
preprocessor statements, there is not much in the way
of "fundamental" dependence on the constantness of sizeof.
It would make very little difference if sizeof were sometimes
non-constant, since it would only affect uses where it was
in fact not constant, none of which exist yet, of course.
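(For the record, the restriction and the usual portable workaround
look like this -- test a value from <limits.h> rather than a size:)

    /* #if sizeof(int) == 4      -- ill-formed: sizeof is meaningless
                                    to the preprocessor              */
    #include <limits.h>

    #if UINT_MAX == 0xFFFFFFFF   /* test a value, not a size */
    typedef unsigned int word32;
    #endif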
> ... Yes, C++ is
>weighed down by its C heritage, but that is also one of its strengths.
>Take that away and it is no longer C++ and we are then discussing a
>different language with very different design criteria.
Actually, GNU C already supports local array variables whose
size is not known at compile time (I presume g++ does as well),
and so it is clearly not fundamental to all C implementations
that all variables have a compile-time known size. I believe
GNU C uses alloca() to implement such arrays. The presence
of this feature in GNU C in no way slows down the rest of the
language.
>Kevlin Henney
S. Tucker Taft stt@inmet.com
Intermetrics, Inc.
Cambridge, MA 02138
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Fri, 6 May 1994 05:53:22 GMT Raw View
In article <Cp93ow.nA@borland.com> pete@genghis.interbase.borland.com (Pete Becker) writes:
>In article <Cp92tF.5Mu@draco.macsch.com>,
>>
>>Why not have a "definition" and "declaration" distinction for classes too?
>>The declaration specifies the interface and advertizes all public attributes
>>and methods, and the definition specifies the implementation.
>
> Where would you put the definition of an inline function? Especially
>if the inline function calls a private member function?
> -- Pete
>
What you do is the opposite. You write EVERYTHING inline.
Then you compile it. Then you
include "myclass";
which loads a precompiled version of the public and protected
interface of the translation unit. A decompiler will show you the
interface. Eiffel calls theirs "short".
The need for 'inline' evaporates. The compiler has the
definitions of EVERYTHING available. The user only
sees the public interface. A deriver only sees the protected
interface. The need for a One Definition Rule evaporates.
You 'include' the actual definition you need.
OK, that's extreme. But a few mods to this idea and we have a
proper module system.
Acknowledgement: I first saw the idea of "include unit_name;"
on the Library reflector, where Bjarne was responding to the
problem of macros interfering with standard library
headers. Jerry Schwarz was proposing #push macroname,
so you could #undef the names used in the header to protect
them from expansion, then #pop to retrieve the old macros.
(Or something similar)
A language 'include' would at least do preprocessing
on the included file independently, so you would be
including the already pre-processed file.
I don't know if Bjarne intended it or not, but if we push
that to completely compiling the file, we have a module system.
We also don't need any 'implementation-defined' method
of specifying what translation units constitute a program.
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: svv@phoenix.dev.macsch.com (Suresh Vaidyanathan)
Date: Fri, 6 May 1994 21:06:18 GMT Raw View
>In article <Cp93ow.nA@borland.com> pete@genghis.interbase.borland.com (Pete Becker) writes:
>>In article <Cp92tF.5Mu@draco.macsch.com>,
>>
>>Why not have a "definition" and "declaration" distinction for classes too?
>>The declaration specifies the interface and advertizes all public attributes
>>and methods, and the definition specifies the implementation.
>
> Where would you put the definition of an inline function? Especially
>if the inline function calls a private member function?
> -- Pete
>
>
Maybe I am missing something, but is this a big deal?
All member function definitions are implementation details, so they belong in the
"definition".
You might want to advertize that the function is "inline", so just the keyword
belongs in the "declaration".
Author: nagle@netcom.com (John Nagle)
Date: Fri, 6 May 1994 20:49:32 GMT Raw View
chase@Think.COM (David Chase) writes:
>As to hiding the sizes of objects, I agree that would be a good
>idea, but I'm not sure that would be acceptable to most C++
>programmers. It either slows down execution (extra lookups),
>or requires extra linker support (late binding of object sizes),
>or requires SELF-style on-the-fly compilation (really late, and
>lazy, binding of object sizes). One reason people program in
>C++ is because they think their code will run faster than some
>of the other OO alternatives, so the first choice is not good.
>Another reason people program in C++ is so that they will be
>able to run on a wide range of standard-ish platforms and
>interoperate with C, so the second choice is not good (linkers
>seem to be a perennial obstacle, I do not know why).
Acceptance of dumb linkers is warping the whole language
out of shape. There's no excuse for this, especially when the
linker and compiler come from the same vendor, as seems to be
increasingly common. The linker problem seems to be a heritage from
the old UNIX days when the linker was in assembler and nobody understood it.
If you're willing to front-end a C compiler to get a C++ compiler,
what's so bad about front-ending the linker to get a C++ linker?
John Nagle
Author: leech@cs.unc.edu (Jon Leech)
Date: 6 May 1994 23:56:24 -0400 Raw View
In article <nagleCpEF6K.859@netcom.com>, John Nagle <nagle@netcom.com> wrote:
> If you're willing to front-end a C compiler to get a C++ compiler,
>what's so bad about front-ending the linker to get a C++ linker?
Different problem. Cfront generates code that works with any
sufficiently robust C compiler, with only size/alignment properties of the
target architecture as parameters. How could you possibly generate a linker
frontend that's portable?
Jon
__@/
Author: rfg@netcom.com (Ronald F. Guilmette)
Date: Sat, 7 May 1994 08:47:49 GMT Raw View
In article <CpECow.9o7@inmet.camb.inmet.com> stt@spock.camb.inmet.com (Tucker Taft) writes:
>
>Actually, since ANSI C disallows the use of sizeof in #if
>preprocessor statements, there is not much in the way
>of "fundamental" dependence on the constantness of sizeof.
>It would make very little difference if sizeof was sometimes
>non-constant, since it would only affect uses where it was
>in fact not constant, none of which exist yet, of course.
I feel compelled to mention that I suggested (some long time ago) that in
C++, one should be permitted to overload `sizeof' on a per-class basis...
and that it should be possible to declare it as a virtual function member.
>> ... Yes, C++ is
>>weighed down by its C heritage, but that is also one of its strengths.
>>Take that away and it is no longer C++ and we are then discussing a
>>different language with very different design criteria.
>
>Actually, GNU C already supports local array variables whose
>size is not known at compile time (I presume g++ does as well),
>and so it is clearly not fundamental to all C implementations
>that all variables have a compile-time known size. I believe
>GNU C uses alloca() to implement such arrays. The presence
>of this feature in GNU C in no way slows down the rest of the
>language.
I have been aware of the GNU C ``dynamic arrays'' feature for quite some
time, but since the discussion here brought up the subject of `sizeof'
I thought that I would try a small experiment, to wit:
#include <stdio.h>

int size;                           /* tentative definition */

int main ()
{
    char array[size];               /* GNU C dynamic array */
    int i = sizeof (array);         /* evaluated at run time */
    printf ("%d\n", i);
    return 0;
}

int size = 100;                     /* initialized before main runs */
This program, when compiled with GCC 2.5.8, prints 100. I find that fact
interesting.
--
-- Ron Guilmette, Sunnyvale, CA ---------- RG Consulting -------------------
---- domain addr: rfg@netcom.com ----------- Purveyors of Compiler Test ----
---- uucp addr: ...!uunet!netcom!rfg ------- Suites and Bullet-Proof Shoes -
Author: jason@cygnus.com (Jason Merrill)
Date: Sat, 7 May 1994 10:20:03 GMT Raw View
>>>>> John Nagle <nagle@netcom.com> writes:
> If you're willing to front-end a C compiler to get a C++ compiler,
> what's so bad about front-ending the linker to get a C++ linker?
Most C++ compilers *do* have special linker support; g++ has collect2 or
GNU ld, Cfront has its collect2-like munger, xlC and Borland C++ have
linkers that merge common pieces of the text segment. However, it seems to
me that late binding of object sizes is outside the realm of linker
support.
Jason
Author: nagle@netcom.com (John Nagle)
Date: Sat, 7 May 1994 17:34:45 GMT Raw View
leech@cs.unc.edu (Jon Leech) writes:
>In article <nagleCpEF6K.859@netcom.com>, John Nagle <nagle@netcom.com> wrote:
>> If you're willing to front-end a C compiler to get a C++ compiler,
>>what's so bad about front-ending the linker to get a C++ linker?
> Different problem. Cfront generates code that works with any
>sufficiently robust C compiler, with only size/alignment properties of the
>target architecture as parameters. How could you possibly generate a linker
>frontend that's portable?
It's easier today. There are only three formats that really matter:
Microsoft/Intel, Apple, and UNIX. If you get them covered, you have
well over 90% of computing.
John Nagle
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Fri, 6 May 1994 06:48:49 GMT Raw View
In article <2qatrp$9g9@scus1.ctstateu.edu> s3900120@scus1.ctstateu.edu (Student 20) writes:
>include file;
>
>IMHO). Going to this sort of thing in C++ would require a major overhaul,
>however.
No. It's the other way around completely. Supporting
this is TRIVIAL to define and EASY to implement (work is involved,
for sure, as is agreement on external file formats).
If we do it, we can solve about 6 major problems
in C++ which result from NOT having a module system. Overhauling
the language definition to solve those problems is certain
to delay Standardisation.
For example, the One Definition Rule. It's complicated.
I wrote the paper which the committee will consider -- not
as a final proposal, but a baseline. It took three months' work
to write that paper.
With modules, we don't need a One Definition Rule.
We don't need inline functions. We don't need to write separate
interfaces. We don't have macros zapping the meaning
of header files. We get an order-of-magnitude speed
increase in compilation.
What else do you want?
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Fri, 6 May 1994 06:39:43 GMT Raw View
In article <2q9gq7$3nm@network.ucsd.edu> mbk@inls1.ucsd.edu (Matt Kennel) writes:
>Ron Rossbach (ej070@cleveland.Freenet.Edu) wrote:
>
>: The real problem (IMHO) is requiring C++ programs which use
>: a particular class to #include the complete (including privates)
>: declaration of the class. As mentioned before, other languages
>: like Ada get this right; a user should only need to #include the
>: public interface.
>
>: I realize this creates problems for object sizing, but they are
>: fixable, I would think.
>
>I personally think that C++ ought to make #include and the
>use of "header files" completely optional, and obsolete.
I agree, and Bjarne has suggested a way of doing it.
(At least, he suggested something like it).
include unit_name; // include interface of compiled unit
It seems to me we can have a full scale, proper
module system in C++ if we want it.
Hey, users out there WILL YOU WAIT FOR THIS?
Because current indications are most users just want to Standardise
ANYTHING and NOW (preferably yesterday <grin>). Which would
delay such a major improvement by 5 years or more.
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd, CSERVE:10236.1703
6 MacKay St ASHFIELD, Mem: SA IT/9/22,SC22/WG21
NSW 2131, AUSTRALIA
Author: mbk@inls1.ucsd.edu (Matt Kennel)
Date: 8 May 1994 00:27:48 GMT Raw View
Suresh Vaidyanathan (svv@phoenix.dev.macsch.com) wrote:
: >In article <Cp93ow.nA@borland.com> pete@genghis.interbase.borland.com (Pete Becker) writes:
: >>In article <Cp92tF.5Mu@draco.macsch.com>,
: >>
: >>Why not have a "definition" and "declaration" distinction for classes too?
: >>The declaration specifies the interface and advertizes all public attributes
: >>and methods, and the definition specifies the implementation.
: >
: > Where would you put the definition of an inline function? Especially
: >if the inline function calls a private member function?
: > -- Pete
: >
: >
: Maybe I am missing something, but is this a big deal?
: All member function definitions are implementation details, so they belong in the
: "definition".
: You might want to advertize that the function is "inline", so just the keyword
: belongs in the "declaration".
Why do we need to make yet another separation between "declaration" and
"definition" and all that?
The abstract type is the abstract 'declaration'. The concrete subtypes
are the concrete 'definitions'. Use them.
Why ought our language have some self-imposed blindness in that it can
only "know" about those pieces of text that are (using an odd mapping
between space and time) "before" those pieces that are "after"?
Why ought our language need to be able to process one source file at a time
without knowing what the other ones are?
--
-Matt Kennel mbk@inls1.ucsd.edu
-Institute for Nonlinear Science, University of California, San Diego
-*** AD: Archive for nonlinear dynamics papers & programs: FTP to
-*** lyapunov.ucsd.edu, username "anonymous".
Author: fjh@munta.cs.mu.OZ.AU (Fergus Henderson)
Date: Sun, 8 May 1994 04:46:32 GMT Raw View
maxtal@physics.su.OZ.AU (John Max Skaller) writes:
[discussion of adding a module system to C++]
> What else do you want?
Backward compatibility.
> If we do it, we can solve about 6 major problems
>in C++ which result from NOT having a module system. Overhauling
>the language definition to solve those problems is certain
>to delay Standardisation.
But you will still have to solve those problems anyway, unless
you abandon backward compatibility.
--
Fergus Henderson - fjh@munta.cs.mu.oz.au
Author: d88-jwa@dront.nada.kth.se (Jon Wätte)
Date: 8 May 1994 10:01:30 GMT Raw View
In <9412814.10061@mulga.cs.mu.OZ.AU> fjh@munta.cs.mu.OZ.AU (Fergus Henderson) writes:
>[discussion of adding a module system to C++]
>> What else do you want?
>Backward compatibility.
I don't see how backwards compatibility comes into it; the usage
of a module system would be orthogonal to the current headers-and-
sources inclusion strategy.
Old files do #include, and it still works. Your module could still do it
when you build it. New files do include modules, and compile much faster
with fewer unnecessary re-compiles.
The problems we have with the current C++ definition are of the type
"things you have to live with" and will probably NOT be solved for
non-module compilations.
--
-- Jon W{tte, h+@nada.kth.se, Mac Hacker Deluxe (on a Swedish scale) --
There's no sex act that can't be made better with Jell-O.
Author: fjh@munta.cs.mu.OZ.AU (Fergus Henderson)
Date: Sun, 8 May 1994 14:09:38 GMT Raw View
d88-jwa@dront.nada.kth.se (Jon Wätte) writes:
>fjh@munta.cs.mu.OZ.AU (Fergus Henderson) writes:
>
>>[discussion of adding a module system to C++]
>>> What else do you want?
>
>>Backward compatibility.
>
>I don't see how backwards compatibility comes into it; the usage
>of a module system would be orthogonal to the current headers-and-
>sources inclusion strategy.
Sure. I don't disagree with this. Perhaps I didn't make my point clear.
John Skaller seemed to be suggesting that introducing a module
system wouldn't slow down standardization much, because it would
solve various problems with the current way of doing things.
My point was that the committee would still have to worry about
those issues (such as the one definition rule), because of the
need for backwards compatibility, and so introducing a module
system *would* slow down standardization.
Personally, I think that although the lack of a decent module system is
a very significant flaw in C++, the language is already complex
enough. But for all those budding language designers out there - if
you do want to add a module system to C++, I think the declaration
should be `import module;' rather than `include module;'.
--
Fergus Henderson - fjh@munta.cs.mu.oz.au
Author: als@tusc.com.au (Anthony Shipman)
Date: 9 May 1994 16:15:46 +1000 Raw View
In <nagleCpEF6K.859@netcom.com> nagle@netcom.com (John Nagle) writes:
>chase@Think.COM (David Chase) writes:
>>As to hiding the sizes of objects, I agree that would be a good
>>idea, but I'm not sure that would be acceptable to most C++
>>programmers. It either slows down execution (extra lookups),
>>or requires extra linker support (late binding of object sizes),
>>or requires SELF-style on-the-fly compilation (really late, and
>>lazy, binding of object sizes). One reason people program in
>>C++ is because they think their code will run faster than some
>>of the other OO alternatives, so the first choice is not good.
>>Another reason people program in C++ is so that they will be
>>able to run on a wide range of standard-ish platforms and
>>interoperate with C, so the second choice is not good (linkers
>>seem to be a perennial obstacle, I do not know why).
> Acceptance of dumb linkers is warping the whole language
>out of shape. There's no excuse for this, especially when the
>linker and compiler come from the same vendor, as seems to be
>increasingly common. The linker problem seems to be a heritage from
>the old UNIX days when the linker was in assembler and nobody understood it.
In the good old days, big iron linkers supported link-time expressions.
A value used in an immediate field in an instruction could be represented as
an expression that was evaluated by the linker. This is what you need for
hiding the object sizes. E.g.
movl %r1, #(4 + _struct_x_size)*3
The assembler or compiler generated a "polish fixup" which was a bit of RPN
to be evaluated by the linker when _struct_x_size became known.
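(A toy model of such a fixup, written in C++ with a record format
invented for the occasion: the compiler emits the expression in RPN,
and the linker evaluates it once the symbol values are known:)

    #include <cstddef>
    #include <cstdlib>
    #include <iostream>
    #include <map>
    #include <stack>
    #include <string>
    #include <vector>

    // One fixup: an RPN program over literals, symbol names and operators,
    // e.g. (4 + _struct_x_size) * 3  ==>  4 _struct_x_size + 3 *
    typedef std::vector<std::string> Fixup;

    long evaluate(const Fixup& rpn, const std::map<std::string, long>& syms)
    {
        std::stack<long> s;
        for (std::size_t i = 0; i < rpn.size(); ++i) {
            const std::string& tok = rpn[i];
            if (tok == "+" || tok == "*") {
                long b = s.top(); s.pop();
                long a = s.top(); s.pop();
                s.push(tok == "+" ? a + b : a * b);
            } else if (syms.count(tok)) {
                s.push(syms.find(tok)->second);  // value known at link time
            } else {
                s.push(std::atol(tok.c_str())); // numeric literal
            }
        }
        return s.top();  // the value patched into the immediate field
    }

    int main()
    {
        Fixup f;  // (4 + _struct_x_size) * 3
        f.push_back("4"); f.push_back("_struct_x_size");
        f.push_back("+"); f.push_back("3"); f.push_back("*");

        std::map<std::string, long> syms;
        syms["_struct_x_size"] = 20;   // the size, once the linker knows it

        std::cout << evaluate(f, syms) << "\n";  // prints 72
    }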
--
Anthony Shipman "You've got to be taught before it's too late,
TUSC Computer Systems Pty Ltd Before you are six or seven or eight,
To hate all the people your relatives hate,
E-mail: als@tusc.oz.au You've got to be carefully taught." R&H