Topic: ANSI C++ "Reference" Implementation


Author: maxtal@Physics.usyd.edu.au (John Max Skaller)
Date: 1995/06/28
In article <3s9ict$ap7@News1.mcs.com>,
Jim Fleming <jim.fleming@bytes.com> wrote:
>
>When the ANSI C++ Language standard is complete will there be a "reference"
>implementation?

 The UK is noted for producing such entities.
There is at least one such "validating" compiler (it generates
messages, not code).

--
        JOHN (MAX) SKALLER,         INTERNET:maxtal@suphys.physics.su.oz.au
 Maxtal Pty Ltd,
        81A Glebe Point Rd, GLEBE   Mem: SA IT/9/22,SC22/WG21
        NSW 2037, AUSTRALIA     Phone: 61-2-566-2189





Author: kanze@gabi-soft.fr (J. Kanze)
Date: 1995/06/26
Steve Clamage (clamage@Eng.Sun.COM) wrote:
|> In article in5@bmdhh222.bnr.ca, dbinder@bnr.ca (David Binderman) writes:
|> >Steve Clamage (clamage@Eng.Sun.COM) wrote:
|> >
|> >
|> >: Unix compilers available today offer all of the ARM features, updated
|> >: by C++ Committee enhancements.
|> >
|> >Some UNIX compilers might, but I've never seen one.

|> Oh, please! Sun, HP, and IBM (for example) sell Unix C++ compilers in the UK.
|> I'm sure there are others as well.

I suspect that it depends on what kind of software you develop.  I've
seen the Sun compiler because I bought it myself; I cannot use it on my
customer sites, because our target machine is a home-built embedded
processor, running SVR4.

If you are only worried about the mainstream Unix providers (like Sun,
IBM and HP), you can generally get a quite good compiler; under Solaris,
I even have the choice of several.  If you are targeting a home-built
embedded Unix, you take what you can get, which often isn't that good.

Our supplier for Motorola 88K Unix SVR4 has only just upgraded to Cfront
3.0.1.  And they have just informed us that they will not be supporting
the 88K in the future (or so I have heard).  So I spend a significant
amount of time translating the ideas in Barton and Nackman into code
using generic.h, and getting complaints that the results are
unreadable; complaints which are, in fact, justified.
--
James Kanze           (+33) 88 14 49 00          email: kanze@gabi-soft.fr
GABI Software, Sarl., 8 rue des Francs Bourgeois, 67000 Strasbourg, France
Conseils en informatique industrielle--
                             --Beratung in industrieller Datenverarbeitung





Author: shepherd@debussy.sbi.com (Marc Shepherd)
Date: 1995/06/26
In article drv@bmdhh222.bnr.ca, dbinder@bnr.ca (David Binderman) writes:
>I've always thought that C++ should go the way of Ada.
>
>Ie if you have a compiler, you cannot call it a C++ compiler until
>it passes some reference test suite.

In fact, there are several commercial test suites available for C++
compilers; vendors who are concerned about portability may insist that
they will only buy compilers that have passed muster under one of the
well-known suites (e.g. Plumhall, Guilmette).

>
>I'm tired of having to manually adjust my code for each new C++
>compiler that comes along, not only for what version of C++ the
>compiler claims to implement, but also the set of bugs in the compiler.

The main problem with C++ is that it has evolved so quickly.  Because
of this, the compiler vendors have never been able to keep up with the
language definition.

Even traditional C tended to vary from one platform to another (albeit
not as much as C++), but the language is now stable thanks to the ANSI
Standard.

The comparison with Ada is particularly infelicitous.  C++ started life
in a research lab; it evolved piecemeal, in response to the experiences
of real users.  Ada was designed by a scholarly committee: no production
Ada programs existed before there was a standard.  The language simply
never got the chance to meander the way C++ and C did.

After the ISO/ANSI Standard is published, I expect the state of the
art in C++ compilers to improve dramatically.  There will be exactly ONE
definition of the language to code to, and people can be fairly certain
that that definition will remain stable for a good 7-10 years.


---
Marc Shepherd
Salomon Brothers Inc
shepherd@schubert.sbi.com The opinions I express are no one's but mine!






Author: adams@stay.sps.mot.com (Adam Seligman)
Date: 1995/06/26
A grammar, a set of type-checking rules, and an operational semantics of
a language can both specify the language and suggest an algorithm that a
compiler could use to process it.
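
To give a flavor of what such type-checking rules look like, here is the
standard textbook inference-rule notation for typing an addition (this
is the generic form, not anything specific to the C++ formalization):

```latex
\frac{\Gamma \vdash e_1 : \mathrm{int} \qquad \Gamma \vdash e_2 : \mathrm{int}}
     {\Gamma \vdash e_1 + e_2 : \mathrm{int}}
```

Read: in typing context Gamma, if e1 and e2 each have type int, then
e1 + e2 has type int. A type checker is essentially a procedure that
searches for a derivation built from such rules.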

If you want to see an example, see http://cs.williams.edu/~kim
If you want to see it for C++, see the ftp site in the sig.

Adam Seligman
95als@cs.williams.edu
Formal Analysis of C++ (Semantics and type rules)
  ftp://cs.williams.edu/pub/students/95als/main.ps or main.dvi





Author: jls@summit.novell.com (Schilling J.)
Date: 1995/06/23
In article <3sc3vl$dnr@engnews2.Eng.Sun.COM> clamage@Eng.Sun.COM writes:
>In article drv@bmdhh222.bnr.ca, dbinder@bnr.ca (David Binderman) writes:
>>I've always thought that C++ should go the way of Ada.
>>
>>Ie if you have a compiler, you cannot call it a C++ compiler until
>>it passes some reference test suite.
>
>There are problems with "official" test suites. If a government body
>says what is "official", what happens when another government designates
>a different suite as "official", and no compiler can pass both suites
>because the suites disagree on some issues?

In the case of Ada, there's only one reference test suite (sponsored by
the U.S. government, and accepted by everyone else).

>Also, official bodies tend
>to pick test suites based on political and economic considerations,
>rather than solely on technical merit.

Actually, the most common criticism of the official Ada test suite has
been just the reverse:  that it is too technically demanding, that it
spends too much time testing the dark corners of the language, and that
Ada compiler vendors have been forced to spend an economically unwise
amount of time passing the suite on every new compiler version, rather
than adding functional enhancements to their development environments.
(The new test suite for Ada 95 is supposed to rectify some of these
problems.)

The Ada test suite also does not flush out compiler bugs very well.
That is, validated compilers may still have significant numbers of
bugs in them.  (That's largely a tribute to how complicated a language
Ada is.)

But it wasn't intended to find bugs.  The main purpose of the test suite
was to make sure all compilers are on the same page; that is, that they
implement the same language -- nothing more and nothing less.  And in that
goal it was and is very successful.  The effort involved in taking sources
written for Compiler A and getting them to run under Compiler B is much less
for Ada than it is for C++.

--
Jonathan Schilling Novell Systems Group  jls@summit.novell.com





Author: clamage@Eng.Sun.COM (Steve Clamage)
Date: 1995/06/23
In article in5@bmdhh222.bnr.ca, dbinder@bnr.ca (David Binderman) writes:
>Steve Clamage (clamage@Eng.Sun.COM) wrote:
>
>
>: Unix compilers available today offer all of the ARM features, updated
>: by C++ Committee enhancements.
>
>Some UNIX compilers might, but I've never seen one.

Oh, please! Sun, HP, and IBM (for example) sell Unix C++ compilers in the UK.
I'm sure there are others as well.

---
Steve Clamage, stephen.clamage@eng.sun.com







Author: "Seth D. Osher" <sosher@plexus.wsoc.com>
Date: 1995/06/23
On 22 Jun 1995, Ian Wild wrote:

>
> >
> >With other standards, like time and weight and measurements, people keep
> >a reference so that people can always stay in synch with reality.
>
> Where is the standard Second kept?
>
> Ian
>

Actually the second and the meter have a great recursive definition (I
think; if I'm wrong someone will tell me, I'm sure :) ). The meter is the
distance light travels, in a vacuum, in x.xxE-xx seconds, and a second is
how much time it takes light to travel some distance (maybe they now use
an atomic decay).

   Seth D. Osher     SE/I Department Computer Analysis  _/_/ _/_/ _/_/
   Internet: sosher@plexus.wsoc.com                    _/_/ _/_/ _/
   Tel. (703) 883-8573   Fax. (703) 883-8788          _/   _/ _/ _/_/
   PRC Inc., Mail Stop TM 431, 1505 PRC Drive, McLean, VA 22102





Author: matt@godzilla.EECS.Berkeley.EDU
Date: 1995/06/24
In article <Pine.SUN.3.91.950623104557.20978E-100000@plexus.wsoc.com> "Seth D. Osher" <sosher@plexus.wsoc.com> writes:

> > >With other standards, like time and weight and measurements, people keep
> > >a reference so that people can always stay in synch with reality.
> >
> > Where is the standard Second kept?
>
> Actually the second and the meter have a great recursive definition (I think,
> if I'm wrong someone will tell me I'm sure :) ). The meter is the distance
> light travels, in a vacuum, in x.xxE-xx seconds, and a second is how much
> time it takes light to travel some distance (maybe they now use an atomic
> decay).

No.  You can only use this definition in one direction!  The meter is
defined in terms of the speed of light (by definition, c is exactly
2.99792458x10^8 m/s), and the second is defined in terms of the
frequency of a particular atomic transition.
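
Concretely, the SI definitions (in force since 1967 for the second and
1983 for the meter) read, schematically:

```latex
1\,\mathrm{s} = 9\,192\,631\,770 \ \text{periods of the Cs-133 hyperfine radiation},
\qquad
1\,\mathrm{m} = \text{distance light travels in vacuum in } \tfrac{1}{299\,792\,458}\,\mathrm{s}.
```

The chain only runs in one direction: frequency is defined first, and
length is derived from it via the exact value of c.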

The kilogram is defined as the mass of a particular lump of metal
that's kept in a lab outside Paris.  And actually, the people who care
about things like this have recently realized that there might be a
problem: there is some reason to think that the mass of this lump is
changing slightly because of dust accumulation and cleaning.

These definitions aren't necessarily the most elegant: The entirely
pragmatic motivation is that experimental measurements of frequencies
are very precise, but experimental measurements of length aren't so
precise.  As for mass, there aren't any known techniques that are any
better than comparison to a known object.

Does this have a moral for C++?  Only a very indirect one: standards
should be defined so that they're useful, and pragmatic, real-world
considerations are important.  That's why a "reference" implementation
of C++ would be pointless.  First, there's no quick way of determining
that two implementations actually are the same.  (Compilers aren't
like blocks of iridium.)  You'd still have to test case after case; it
wouldn't be any easier than determining whether a single compiler
matches a standard.  Second, as we can see from the kilogram story,
defining something in terms of a particular exemplar raises serious
problems when that exemplar is itself imperfect.  We can live with
this problem when we're talking about lumps of metal, but we couldn't
live with it if we were talking about language implementations.
--
Matt Austern          matt@physics.berkeley.edu
http://dogbert.lbl.gov/~matt





Author: jarmo@ksv.ksvltd.fi (Jarmo Raiha)
Date: 1995/06/24
jim.fleming@bytes.com (Jim Fleming) writes:


>When the ANSI C++ Language standard is complete will there be a "reference"
>implementation? In other words, a compiler which will provide the "final
>word" in resolving semantic debates and other issues.

I don't think we need a reference compiler. What we need is a reference
way of making the compiler. The standard should be formulated
in such a way that the compiler **can** be produced mechanically
using proven algorithms, at least up to generating some hypothetical
target code (with, of course, a standard hypothetical cpu executing it).

For example, if the grammar were a pure context-free LR grammar, theory
and generators already exist to produce recognizers for it. Unfortunately
C++ has no such CFG, and we would also need formal definitions and
generators for symbol tables with namespaces, code generation, etc.

Now if implementors want to prove their designs correct, they only
have to prove their sets of algorithms equivalent to the standard
definitions. Of course there will still be bugs, but correcting them
will move the compiler closer to the standard.


Well, this is just my imagination.
I have no idea whether C++ (or any other practical language) can ever be
formulated in such a way, but I think that only then can we have a true
standard and a reference implementation.

Anyhow, such a rewrite of C++ will never come.

>With other standards, like time and weight and measurements, people keep
>a reference so that people can always stay in synch with reality. Is this
>same approach possible with a computer language?

As an analogy from measurement technique:
in older days there were standard battery cells for keeping a primary
voltage standard, and they are still used for short-term storage.
But now we can trace voltage back to frequency (time) by using
the properties of a Josephson junction. That is, we need only a frequency
standard and a proven f-to-V translator to have both frequency and voltage.

The implementation of the f->V translator can be buggy, but the point is
that there is at least a theory behind that translation. While it holds,
we can debug the translator to get closer and closer.

Jarmo Raiha





Author: jim.fleming@bytes.com (Jim Fleming)
Date: 1995/06/24
In article <jarmo.803988113@ksv>, jarmo@ksv.ksvltd.fi says...
>
>jim.fleming@bytes.com (Jim Fleming) writes:
>
>
>>When the ANSI C++ Language standard is complete will there be a "reference"
>>implementation? In other words, a compiler which will provide the "final
>>word" in resolving semantic debates and other issues.
>
>I don't think we need a reference compiler. What we need is a reference
>way of making the compiler. The standard should be formulated
>in such a way that the compiler **can** be produced mechanically
>using proven algorithms. At least upto generating some hypotetical
>target code. (With of course a standard hypotetical cpu executing it)
>
@@@@@@@
This is interesting. What would be the input?

Would you feed a description of the grammar into the system?
What about the semantic information?

Would the hypothetical cpu be based on a "procedural" model or an
"object" model? Since C++ handles both views of the world,
would the "engine" have to have a split personality?

Maybe the reference compiler could be generated so that if a program
does *not* contain any OO constructs, then procedural code is emitted
for a "standard hypothetical" procedural cpu. If OO constructs are
detected, then a weaker reference check is done, because it would
be harder for the compiler to ensure that the OO paradigm and the
procedural paradigm were not "mixed".

If there are OO constructs in the code then maybe the reference compiler
could switch modes and issue tons of warnings to programmers who try
to step around the OO paradigm. In this way, maybe the compiler would
help to "steer" people toward real OO programming.

@@@@@@@

>For example if the grammar were pure context free LR there exist theory and
>generators to generate regognizers for that grammar. Unfortunately
>C++ is no CFG and we need a formal definition and generators for
>symboltables with namespaces , code generation etc.
>
>Now if an implementor wants to prove his design correct he only
>has to prove his set of algoriths equivalent to the standard definitions.
>Of course bugs will be there , but correcting them will move the compiler
>closer to the the standard.
>

@@@@@@@

This reminds me a little of the work that Stephen Johnson and others
pioneered on the early C compilers at Bell Labs. They had a technique
where it was easy to port the compiler because they had little scripts
for doing certain things (like increment a variable). As the compiler
designers learned more about the underlying machine, they extended the
scripts to allow access to the special instructions built into the
machine (for example auto-increment).

The net result was that the compiler ported rather quickly, and then
as it was tuned, the code produced got better and better. This followed
the UNIX philosophy of getting something working quickly and then
evolving the system to a better and better solution. At each step,
regression testing could be done and if something failed, there was
a previous version to fall back to.

@@@@@@@
>
>Well , this is just my imagination.
>I have no idea if c++ (or any other practical language) can ever be
>formulated in such a way but I think that only then we can have a true
>standard and a reference implemention.
>
>Anyhow, such a rewrite of c++ will never come.
                               ^^^^^^^^^^^^^^^
>
@@@@@@@

It seems to me that C++ could benefit from a solid definition of a
"Virtual C++ Machine". Smalltalk has one. C+@ has one. Java has one.

If a solid virtual C++ machine were available as a "reference cpu",
then compiler vendors could, as a first cut, produce code for the
virtual C++ cpu. Maybe an interpreter or emulator could be used to run
this code against known problems. This gets into the test suite arena.

Once a compiler vendor reached a certain level of conformance, then
they could add value in various ways. I suspect that most would focus
on class libraries and development environment capabilities, because
after all, designers and programmers care more in this day and age
about reuse and CASE tools than whether the language uses an "=" or ":=".

@@@@@@@

[snipped interesting discussion on electrical standards]
>
>Jarmo Raiha

@@@@@@@

I think there are many areas for fertile research here. I am curious
about your comment that...

>Anyhow, such a rewrite of c++ will never come.
                               ^^^^^^^^^^^^^^^
...why is it that you do not think that more research will go into this?

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
BTW...many people make the assumption that large research labs are 5 years
ahead of the market. Twenty years ago that was probably true. As citizens
of the Internet, you now have a chance to track the leading edge research
on an hour by hour basis. Some people do not want you to be able to do
this or participate in the process. As we approach the July 4th weekend
in the U.S., hopefully everyone around the world will appreciate that free
speech and the exchange of ideas and viewpoints allows everyone to enjoy
the fruits of the planet's technology and the benefits of discussions with
other humans that may not have been born, educated, or employed in...

 ..."The Right Place"...;+)

--
Jim Fleming            /|\      Unir Corporation       Unir Technology, Inc.
jrf@tiger.bytes.com  /  | \     One Naperville Plaza   184 Shuman Blvd. #100
%Techno Cat I       /   |  \    Naperville, IL 60563   Naperville, IL 60563
East End, Tortola  |____|___\   1-708-505-5801         1-800-222-UNIR(8647)
British Virgin Islands__|______ 1-708-305-3277 (FAX)   1-708-305-0600
                 \__/-------\__/       http:199.3.34.13 telnet: port 5555
Smooth Sailing on Cruising C+@amarans  ftp: 199.3.34.12 <-----stargate----+
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\____to the end of the OuterNet_|






Author: jarmo@ksv.ksvltd.fi (Jarmo Raiha)
Date: 1995/06/24
jim.fleming@bytes.com (Jim Fleming) writes:

>In article <jarmo.803988113@ksv>, jarmo@ksv.ksvltd.fi says...

>>I don't think we need a reference compiler. What we need is a reference
>>way of making the compiler. The standard should be formulated
>>in such a way that the compiler **can** be produced mechanically
>>using proven algorithms. At least upto generating some hypotetical
>>target code. (With of course a standard hypotetical cpu executing it)

>This is interesting. What would be the input?

It would be the language standard. The output would be the compiler :-)#

>Would you feed a description of the grammar into the system?
>What about the semantic information?

Preferably, yes.
This is just what I was wondering: is it possible to express
semantic information etc. in a way that would make automatic
generation feasible?

>Would the hypothetical cpu be based on a "procedural" model or an
> "object" model? Since C++ handles both views of the world

I just wanted to leave out the actual code generation phase.
By the stage of generating actual machine code, conformance
has either been proved or disproved.

Detecting the equivalence of the intermediate representations
before actual machine code generation would be enough.

-----------------
I am not arguing for one compiler that would be the final word,
but for a definite set of algorithms to build one.
Not a shopping-list standard.
-----------------

>It seems to me that C++ could benefit from a solid definition of a
>"Virtual C++ Machine". Smalltalk has one. C+@ has one. Java has one.

Indeed. This would be the stage where the correctness of the compilers
would be evaluated.


Jarmo Raiha





Author: jim.fleming@bytes.com (Jim Fleming)
Date: 1995/06/24
In article <jarmo.804025501@ksv>, jarmo@ksv.ksvltd.fi says...
>
>jim.fleming@bytes.com (Jim Fleming) writes:
>
>>In article <jarmo.803988113@ksv>, jarmo@ksv.ksvltd.fi says...
>
>>>I don't think we need a reference compiler. What we need is a reference
>>>way of making the compiler. The standard should be formulated
>>>in such a way that the compiler **can** be produced mechanically
>>>using proven algorithms. At least upto generating some hypotetical
>>>target code. (With of course a standard hypotetical cpu executing it)
>
>>This is interesting. What would be the input?
>
>It would be the language standard. Output will be the compiler :-)#
>
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
EEK-THE-C+@...:)

WOW...that would be a major-league parsing job, or the "standard" would
have to be rewritten into a format that a machine could understand.
This might be good, because many languages now used to communicate
with machines have a lot of precision, and people can learn
to develop a common understanding of writings in such a language.

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
>>Would you feed a description of the grammar into the system?
>>What about the semantic information?
>
>Prefereably, yes.
>This is just what I was wondering. Is it possible to express
>semantic information etc. in a way that could make an automatic
>generation feasible ?
>
@@@@@@@@@@@@@@@@@@@@@@

This is a tough one...it seems to me that language syntax specification
is very disconnected from semantic specification. I have advocated for
some time that syntax should be simple and semantics should be precise.

I need to think about how this might be done...
 ...hopefully, one of the experts will tell us...:)

@@@@@@@@@@@@@@@@@@@@@@

>>Would the hypothetical cpu be based on a "procedural" model or an
>>       "object" model? Since C++ handles both views of the world
>
>I just wanted to leave out the actual code generation phase.
>When we are at the stage of generating actual  machine code the
>conformance is either proved or disproved.
>
>Detecting the equivalence of the intermediate representation
>before actual machine code generation would be enough.
>
@@@@@@@@@@@@@@@@@

It is interesting that you bring up intermediate representations.
One of the differences between C+@ and Java is that C+@ has a
sophisticated bi-level representation scheme. The compiler produces
binaries which are largely binary representations of the syntax. These
binaries are converted on-the-fly at runtime into another level of
abstraction which is more like a virtual C+@ machine. It is from there
that the methods are "beaded" into the target architecture.

Using this approach, we might be able to specify an intermediate
form for C++. My question still remains, though, whether that form
should support both the procedural and the OO paradigm, and whether
the two should be kept as an integrated unit or split by some sort
of partitioning.

>-----------------
>I am not speaking for a compiler that would be the final one
>but a definite set of algoritms to build one.
>Not a shopping list standard.
>-----------------
>
>>It seems to me that C++ could benefit from a solid definition of a
>>"Virtual C++ Machine". Smalltalk has one. C+@ has one. Java has one.
>
>Indeed. This would be the stage where the correctness of the compilers
>would be evaluated.
>
>Jarmo Raiha
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

At that stage we could verify that they crunched the syntax correctly.
Hopefully, we could also verify that they picked up some of the semantic
information in the program.

If the approach were taken, where we assume that *many* programs do not
and never will contain any OO constructs, then we might be able to first
classify a program as being Non-OO. Those programs could then be put
through a different verification suite.

What may be possible is that in the interim, we might be able to verify
that a certain number of the compilers passed the "procedural certification"
phase. This might be all most people need because they never intend to
use the OO extensions.

As for the OO side of the certification and verification, it might be
good to develop a style guide that allows programmers to be "steered"
in a direction where their OO programs have some hope of migrating to
Java or to C+@. This could be added as an extension to the ANSI/ISO
standard and may help to give C++ users hope that their OO software will
have some useful life once they break free of the procedural shackles.

--
Jim Fleming            /|\      Unir Corporation       Unir Technology, Inc.
jrf@tiger.bytes.com  /  | \     One Naperville Plaza   184 Shuman Blvd. #100
%Techno Cat I       /   |  \    Naperville, IL 60563   Naperville, IL 60563
East End, Tortola  |____|___\   1-708-505-5801         1-800-222-UNIR(8647)
British Virgin Islands__|______ 1-708-305-3277 (FAX)   1-708-305-0600
                 \__/-------\__/       http:199.3.34.13 telnet: port 5555
Smooth Sailing on Cruising C+@amarans  ftp: 199.3.34.12 <-----stargate----+
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\____to the end of the OuterNet_|






Author: jim.fleming@bytes.com (Jim Fleming)
Date: 1995/06/24
In article <3sbakd$epa@dolphin.pst.cfmu.eurocontrol.be>,
ian@cfmu.eurocontrol.be says...
>
>jim.fleming@bytes.com (Jim Fleming) wrote:
>>
>>When the ANSI C++ Language standard is complete will there be a "reference"
>>implementation? In other words, a compiler which will provide the "final
>>word" in resolving semantic debates and other issues.
>>
>>With the recent discussion in this group on the "Pentium-like" bug in some
>>of the C++ compilers, it seems like it might be useful to have a compiler
>>which is agreed to be the "reference implementation".
>
>Will this reference compiler be bug free?  If not, do all other compilers
>have to duplicate its bugs?  And will there be a reference compiler for
>each possible platform, or just the one?  If a reference compiler were
>running on an early Pentium, would all other machines be forced to copy
>its floating point behaviour?
>
@@@@

I don't think that any compiler (or any software) can be bug-free. I
would hope that if a compiler were selected to be the "reference", it
would be a high-quality piece of software. After all, the estimate is
that $100,000,000 has been spent just developing the C++ standard; I
would assume that a few bucks could be put into a reference version.

As for multiple platforms, I would assume that the ideal scenario would
be for this reference compiler to be written in C++, to be able to
compile itself, and to be easily adapted to a new platform. Because
it uses object technology, portability and reuse should be high on the
list of attributes, and therefore the reference compiler should be
easily ported.

As for the early Pentiums, I do not think their problem was the language
or the compiler, but the implementation of the floating point processor.
I do not think that code had to be recompiled when Intel issued the new
processors without the bug. BTW, if C++ used "true objects", then
not only would the Pentium bug have been easier to discover, but a fix
could also have been put in without changing the chip, at least allowing
users to function until the silicon could be changed.

@@@@
[snip]
>
>Ian
>

--
Jim Fleming            /|\      Unir Corporation       Unir Technology, Inc.
jrf@tiger.bytes.com  /  | \     One Naperville Plaza   184 Shuman Blvd. #100
%Techno Cat I       /   |  \    Naperville, IL 60563   Naperville, IL 60563
East End, Tortola  |____|___\   1-708-505-5801         1-800-222-UNIR(8647)
British Virgin Islands__|______ 1-708-305-3277 (FAX)   1-708-305-0600
                 \__/-------\__/       http:199.3.34.13 telnet: port 5555
Smooth Sailing on Cruising C+@amarans  ftp: 199.3.34.12 <-----stargate----+
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\____to the end of the OuterNet_|






Author: jim.fleming@bytes.com (Jim Fleming)
Date: 1995/06/24
In article <3se12o$in5@bmdhh222.bnr.ca>, dbinder@bnr.ca says...
>
>Steve Clamage (clamage@Eng.Sun.COM) wrote:
[snip]
>: The purpose of the standard is to provide a reference for the meaning
>: of well-formed programs. The definition of C++ has been changing over
>: the years, and should stabilize with the publication of the standard.
>
>Yes, but the standard is a *long* way from what folks are using now,
>and what compilers do now. I can easily see the fight to implement
>ANSI C++ destroying wide spread industrial use of C++.
>
>I can also easily see that the complexities of the post-ARM features
>make C++ too complex for widespread use.
>
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

Now that the standard is nearly complete, have any companies
announced dates for when their compilers will meet it?
Have any conformance test suites been announced? Is there a central
body or company that is ready to certify conformance?

Many people argue against other programming languages because of a lack
of a "product" or a commercial version. It seems to me that C++ suffers
from this same lack of a working, tested, and/or supported version.

Why is there a "double standard" in this area? Should not people be
saying, "I am not going to use C++ until someone produces a version
that supports the published specification for C++". That is the argument
that people love to use for other languages. Why does C++ get a free
pass in this area?

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
[snip]

>: >As things stand now, I'm still waiting for most UNIX compilers to
>: >get up to the standard presented in the ARM.
>
[snip]
>
>David C Binderman MSc BSc       dbinder@bnr.co.uk         +44 1628 794 887

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

The Copyright date on the "ARM" is 1990. That is 5 years ago. Are you
saying that in 5 years, vendors still have not brought their products
up to the specification described in the ARM?

If so, can you give us any major reasons why this is the case?

Are vendors finding that it is better to be "non-standard" to help
lock in their customers?

Also, since the ARM does NOT include the class library which IS included
in the standard, can you give us any projections on how long it will
take for vendors to bring their class libraries into conformance?

BTW, how are vendors going to achieve this class library conformance
when the standard carries copyright restrictions?

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
--
Jim Fleming            /|\      Unir Corporation       Unir Technology, Inc.
jrf@tiger.bytes.com  /  | \     One Naperville Plaza   184 Shuman Blvd. #100
%Techno Cat I       /   |  \    Naperville, IL 60563   Naperville, IL 60563
East End, Tortola  |____|___\   1-708-505-5801         1-800-222-UNIR(8647)
British Virgin Islands__|______ 1-708-305-3277 (FAX)   1-708-305-0600
                 \__/-------\__/       http:199.3.34.13 telnet: port 5555
Smooth Sailing on Cruising C+@amarans  ftp: 199.3.34.12 <-----stargate----+
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\____to the end of the OuterNet_|






Author: jim.fleming@bytes.com (Jim Fleming)
Date: 1995/06/24
In article <MATT.95Jun23173604@godzilla.EECS.Berkeley.EDU>,
matt@godzilla.EECS.Berkeley.EDU says...
>
>In article <Pine.SUN.3.91.950623104557.20978E-100000@plexus.wsoc.com>
>"Seth D. Osher" <sosher@plexus.wsoc.com> writes:
>
>> > >With other standards, like time and weight and measurements, people keep
>> > >a reference so that people can always stay in synch with reality.
>> >
>> > Where is the standard Second kept?
>>
>> Actually the second and the meter have a great recursive definition (I think,
>> if I'm wrong someone will tell me I'm sure :) ). The meter is the distance
>> light travels, in a vacuum, in x.xxE-xx seconds, and a second is how much
>> time it takes light to travel some distance (maybe they now use an atomic
>> decay).
>
>No.  You can only use this definition in one direction!  The meter is
>defined in terms of the speed of light (by definition, c is exactly
>2.99792458x10^8 m/s), and the second is defined in terms of the
>frequency of a particular atomic transition.
>
>The kilogram is defined as the mass of a particular lump of metal
>that's kept in a lab outside Paris.  And actually, the people who care
>about things like this have recently realized that there might be a
>problem: there is some reason to think that the mass of this lump is
>changing slightly because of dust accumulation and cleaning.
>
>These definitions aren't necessarily the most elegant: The entirely
>pragmatic motivation is that experimental measurements of frequencies
>are very precise, but experimental measurements of length aren't so
>precise.  As for mass, there aren't any known techniques that are any
>better than comparison to a known object.
>
>Does this have a moral for C++?  Only a very indirect one: standards
>should be defined so that they're useful, and pragmatic, real-world
>considerations are important.  That's why a "reference" implementation
>of C++ would be pointless.  First, there's no quick way of determining
>that two implementations actually are the same.  (Compilers aren't
>like blocks of iridium.)  You'd still have to test case after case; it
>wouldn't be any easier than determining whether a single compiler
>matches a standard.  Second, as we can see from the kilogram story,
>defining something in terms of a particular exemplar raises serious
>problems when that exemplar is itself imperfect.  We can live with
>this problem when we're talking about lumps of metal, but we couldn't
>live with it if we were talking about language implementations.
>--
>Matt Austern                                  matt@physics.berkeley.edu
>http://dogbert.lbl.gov/~matt
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

Do you have any suggestions on how conformance can be tested?

Do you think that a central organization should help certify conformance?

Do you think that the conformance to the language standard can be
 separated from conformance with the class library?
 Could companies claim conformance to the language without
 providing the class library?

If you think that the language and class library should be kept together
then doesn't that present an "opportunity" to use the class library as
part of the validation suite?

Rather than develop a "reference" compiler maybe a reference implementation
of the class library should be freely available as a first cut in testing
a new or existing compiler.
 Is there a freeware version of the entire class library available?







Author: barmar@nic.near.net (Barry Margolin)
Date: 1995/06/21
Raw View
Am I really bothering to respond to a Jim Fleming post?  Actually, in this
case his question isn't all that unreasonable.  We discussed a similar
issue in the Common Lisp committee.

In article <3s9ict$ap7@News1.mcs.com> jim.fleming@bytes.com (Jim Fleming) writes:
>With the recent discussion in this group on the "Pentium-like" bug in some
>of the C++ compilers, it seems like it might be useful to have a compiler
>which is agreed to be the "reference implementation".

And what happens if there's a bug in the reference implementation?  Is
everyone then required to copy that bug, even though it contradicts the
written standard?

Another problem is deciding whose implementation should be used as the
reference.  The vendor whose implementation is chosen will have a
significant competitive advantage in the market -- no customer could ever
ask for their money back because the product doesn't conform.  The
alternative to this is developing an entirely new implementation just for
this purpose, but who is going to do all that work?  A C++ implementation
is a non-trivial project, and most existing implementations have taken
years to develop (plus, most were derived from C implementations, so you
have to include the time it took to implement that).

What language would the reference implementation be written in?  If it's
written in C++, then the correctness of the reference implementation is
dependent on the correctness of the C++ implementation used to compile it.

A more common technique used for determining whether an implementation
conforms to a standard is a test suite.  Are there any official language
standards that have a reference implementation?

>With other standards, like time and weight and measurements, people keep
>a reference so that people can always stay in synch with reality. Is this
>same approach possible with a computer language?

Time and weight standards don't use references these days.  They are
generally defined in terms of fundamental properties of the universe.  For
instance, I believe the meter is defined as N wavelengths of the light
emitted by element X, and the second is defined as the time it takes an
atom of element Y to vibrate M times.

>If I recall, the word "foot" comes from the fact that some king or pope's
>foot was used as the "reference" for defining what 12 inches meant to
>the common people.

Standards for weights and measures used to be defined from references.  For
instance, the US Bureau of Weights and Measures used to have a bar of metal
with two markings whose separation was the official definition of a foot.
They probably still have it, but it's only for show now.  As the precision
in scientific measurement increased, standards like these simply became
unusable (consider the imprecision resulting from the width of the markings
on the bar).

The official definition of a foot is now precisely 12x0.0254 meters, using
the above definition of a meter.
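[Ed.: the chain of definitions above bottoms out in exact arithmetic, which is easy to check. A throwaway Python sketch, using only the constants quoted in this thread; nothing here is part of any standard:]

```python
# The modern definitions reduce to exact arithmetic: the speed of light
# and the inch-to-metre ratio are fixed by definition, so the foot follows.
c = 299792458          # metres per second, exact by definition
inch = 0.0254          # metres, exact by international agreement
foot = 12 * inch       # the definition quoted above

print(round(foot, 4))  # 0.3048 metres
print(foot / c)        # time for light to cross one foot, in seconds
```

[Ed.: the second line shows why length is now derived from time: a clock that resolves nanoseconds pins down a foot, since light crosses it in about a nanosecond.]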
--
Barry Margolin
BBN Planet Corporation, Cambridge, MA
barmar@{bbnplanet.com,near.net,nic.near.net}
Phone (617) 873-3126 - Fax (617) 873-5124





Author: Ian Wild <ian@cfmu.eurocontrol.be>
Date: 1995/06/22
Raw View
jim.fleming@bytes.com (Jim Fleming) wrote:
>
>When the ANSI C++ Language standard is complete will there be a "reference"
>implementation? In other words, a compiler which will provide the "final
>word" in resolving semantic debates and other issues.
>
>With the recent discussion in this group on the "Pentium-like" bug in some
>of the C++ compilers, it seems like it might be useful to have a compiler
>which is agreed to be the "reference implementation".

Will this reference compiler be bug-free?  If not, do all other compilers
have to duplicate its bugs?  And will there be a reference compiler for
each possible platform, or just the one?  If a reference compiler were
running on an early Pentium, would all other machines be forced to copy
its floating point behaviour?

If the reference compiler _were_ bug-free, there must have been some way
of verifying this by checking against the Standard.  It would seem, then,
more reasonable to use this same checking procedure to verify a new
compiler.  The alternative, trying to compile every legal program with
both the test and the reference compilers and doing a diff on the results,
sounds a little tedious.
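[Ed.: the compile-and-diff procedure described above can at least be sketched. In the toy Python harness below, the two compilers are modelled as plain functions so the logic is visible; invoking real compilers via subprocess is the obvious substitution. All names here are illustrative, not any real tool:]

```python
# Differential testing sketch: feed the same test cases to a "reference"
# compiler and a candidate compiler, and flag any case where they disagree.

def reference_compile(source: str) -> str:
    # stand-in for the reference implementation
    return source.strip().lower()

def candidate_compile(source: str) -> str:
    # stand-in for the compiler under test, with one injected defect
    out = source.strip().lower()
    return out.replace("throw", "thorw")   # deliberate "bug"

def differential_test(cases):
    """Return the test cases on which the two compilers disagree."""
    return [c for c in cases if reference_compile(c) != candidate_compile(c)]

cases = ["int main() { return 0; }",
         "void f() { throw 1; }"]
print(differential_test(cases))   # only the case exercising the defect
```

[Ed.: the sketch also illustrates Ian's objection: the harness only catches a bug if some test case happens to exercise it, so "every legal program" is exactly what you cannot enumerate.]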

>
>With other standards, like time and weight and measurements, people keep
>a reference so that people can always stay in synch with reality.

Where is the standard Second kept?

Ian







Author: dbinder@bnr.ca (David Binderman)
Date: 1995/06/22
Raw View
Ian Wild (ian@cfmu.eurocontrol.be) wrote:
: If the reference compiler _were_ bug-free, there must have been some way
: of verifying this by checking against the Standard.  It would seem, then,
: more reasonable to use this same checking procedure to verify a new
: compiler.  The alternative, trying to compile every legal program with
: both the test and the reference compilers and doing a diff on the results,
: sounds a little tedious.

I've always thought that C++ should go the way of Ada.

I.e., if you have a compiler, you cannot call it a C++ compiler until
it passes some reference test suite.

Of course, I've heard Ada users say that there's not enough in their
reference test suite.

This could be a profitable branding opportunity for a small company.

I'm tired of having to manually adjust my code for each new C++
compiler that comes along, not only for what version of C++ the
compiler claims to implement, but also the set of bugs in the compiler.

My vote for a reference compiler would be cfront 3. Buggy, but at
least it's easy to port, almost universal in at least the UNIX world,
and the language it implements is simple enough for us mortals to
understand and get something done with.

As things stand now, I'm still waiting for most UNIX compilers to
get up to the standard presented in the ARM. Given the
progress in UNIX C++ compilers I've seen since 1990, ie almost none,
I figure 2005 at the earliest for ANSI C++ compilers on UNIX being
common enough to be usable.

Sigh.


David C Binderman MSc BSc       dbinder@bnr.co.uk         +44 1628 794 887
Object Oriented Design & Analysis with C++ since 1988
                      Code policeman, language lawyer and Purify advocate





Author: clamage@Eng.Sun.COM (Steve Clamage)
Date: 1995/06/22
Raw View
In article drv@bmdhh222.bnr.ca, dbinder@bnr.ca (David Binderman) writes:
>I've always thought that C++ should go the way of Ada.
>
>I.e., if you have a compiler, you cannot call it a C++ compiler until
>it passes some reference test suite.

Test suites for C++ do exist, and compiler vendors typically use at
least one of them as part of their normal QA procedures.

>Of course, I've heard Ada users say that there's not enough in their
>reference test suite.

There are problems with "official" test suites. If a government body
says what is "official" what happens when another government designates
a different suite as "official" and no compiler can pass both suites
because the suites disagree on some issues. Also, official bodies tend
to pick test suites based on political and economic considerations,
rather than solely on technical merit.

>
>I'm tired of having to manually adjust my code for each new C++
>compiler that comes along, not only for what version of C++ the
>compiler claims to implement, but also the set of bugs in the compiler.

The purpose of the standard is to provide a reference for the meaning
of well-formed programs. The definition of C++ has been changing over
the years, and should stabilize with the publication of the standard.

>My vote for a reference compiler would be cfront 3. Buggy, but at
>least it's easy to port, almost universal in at least the UNIX world,
>and the language it implements is simple enough for us mortals to
>understand and get something done with.
>
>As things stand now, I'm still waiting for most UNIX compilers to
>get up to the standard presented in the ARM.

Aren't you contradicting yourself on every point? You want bug-free
compilers, but you want buggy cfront 3.0 to be the standard. You want
compilers to catch up to the ARM, but cfront 3.0 does not implement
the ARM, nor does it have any of the newer features added by the
C++ committee.

>Given the
>progress in UNIX C++ compilers I've seen since 1990, ie almost none,

What? 1990 marked cfront 2.1: No templates, no exceptions, no other
modern C++ features. Debugging meant trying to understand the C code
generated by cfront, since you typically could not find a debugger that
knew anything about C++. You can buy off the shelf today (or get for free
in the case of g++) Unix compilers for almost any platform which implement
more of C++ than cfront 3.0, which have fewer bugs, and which offer
debugging support ranging from acceptable to excellent.

>I figure 2005 at the earliest for ANSI C++ compilers on UNIX being
>common enough to be usable.

Unix compilers available today offer all of the ARM features, updated
by C++ Committee enhancements. The level of support of the newest
features varies. Assuming the standard is near closure by next year,
I expect commercial C++ compilers on all major Unix platforms,
supporting the standard in 1996. I will be very surprised indeed if
in 1997 you have trouble finding a standard-conforming C++ compiler for
any common platform.

---
Steve Clamage, stephen.clamage@eng.sun.com







Author: dbinder@bnr.ca (David Binderman)
Date: 1995/06/23
Raw View
Steve Clamage (clamage@Eng.Sun.COM) wrote:
: There are problems with "official" test suites. If a government body
: says what is "official" what happens when another government designates
: a different suite as "official" and no compiler can pass both suites
: because the suites disagree on some issues. Also, official bodies tend
: to pick test suites based on political and economic considerations,
: rather than solely on technical merit.

But it doesn't have to be an official test suite, and governments
don't have to be involved. Didn't X/Open have something similar for
UNIX/1170? I.e., if an operating system passed the 1,170 tests then it
could be called UNIX.

Also, I think having a standard test suite, even if technically not of
the highest quality, is better than having none.

: The purpose of the standard is to provide a reference for the meaning
: of well-formed programs. The definition of C++ has been changing over
: the years, and should stabilize with the publication of the standard.

Yes, but the standard is a *long* way from what folks are using now,
and what compilers do now. I can easily see the fight to implement
ANSI C++ destroying widespread industrial use of C++.

I can also easily see that the complexities of the post-ARM features
make C++ too complex for widespread use.

: >My vote for a reference compiler would be cfront 3. Buggy, but at
: >least it's easy to port, almost universal in at least the UNIX world,
: >and the language it implements is simple enough for us mortals to
: >understand and get something done with.
: >
: >As things stand now, I'm still waiting for most UNIX compilers to
: >get up to the standard presented in the ARM.

: Aren't you contradicting yourself on every point? You want bug-free
: compilers, but you want buggy cfront 3.0 to be the standard. You want
: compilers to catch up to the ARM, but cfront 3.0 does not implement
: the ARM, nor does it have any of the newer features added by the
: C++ committee.

In an ideal world, bug-free software is nice, but realistically,
juggling a set of different compilers is made easier by just using
one compiler, ported to different boxes.

Having compilers migrate to the ARM would be nice, but controlling the
education & use of the new features (ie templates & exceptions)
while the compilers get debugged makes me think we'll just not
bother with those features until they work right.

: >Given the
: >progress in UNIX C++ compilers I've seen since 1990, ie almost none,

: What? 1990 marked cfront 2.1: No templates, no exceptions, no other
: modern C++ features. Debugging meant trying to understand the C code
: generated by cfront, since you typically could not find a debugger that
: knew anything about C++.

My mistake, I think cfront 3.0 was June 1991, not 1990. That compiler
is still nearly universal over here in the UK even now in 1995.

: You can buy off the shelf today (or get for free
: in the case of g++) Unix compilers for almost any platform which implement
: more of C++ than cfront 3.0, which have fewer bugs, and which offer
: debugging support ranging from acceptable to excellent.

Maybe in America, but over here in the UK compilers are a thousand
pounds or more ($1500?) a seat. They don't get changed often.

G++ might claim to implement more of C++ than cfront 3.0, but I
know recent releases (pre 2.7.0) have not had the quality associated
with commercial compilers.

However, preliminary tests with G++ 2.7.0 are favourable so far.

: >I figure 2005 at the earliest for ANSI C++ compilers on UNIX being
: >common enough to be usable.

: Unix compilers available today offer all of the ARM features, updated
: by C++ Committee enhancements.

Some UNIX compilers might, but I've never seen one.

Regards

David C Binderman MSc BSc       dbinder@bnr.co.uk         +44 1628 794 887





Author: jim.fleming@bytes.com (Jim Fleming)
Date: 1995/06/21
Raw View
When the ANSI C++ Language standard is complete will there be a "reference"
implementation? In other words, a compiler which will provide the "final
word" in resolving semantic debates and other issues.

With the recent discussion in this group on the "Pentium-like" bug in some
of the C++ compilers, it seems like it might be useful to have a compiler
which is agreed to be the "reference implementation".

With other standards, like time and weight and measurements, people keep
a reference so that people can always stay in synch with reality. Is this
same approach possible with a computer language?

If I recall, the word "foot" comes from the fact that some king or pope's
foot was used as the "reference" for defining what 12 inches meant to
the common people. Some people have told me that C++ has a standard which
is called an "arm"..:) Unfortunately, it is hard to tell how the foot and
the arm are connected but I am confident that some "heads" will figure
that out..:)

...maybe what is needed is a "body" to connect the "foot" and the "arm"
and then the "heads" could be placed on top of this body...the result
could be called a "monster" and this could be the reference implementation..

...just think...packages could say...
 ..."conforms to the Monster ANSI C++ Standard..."


P.S. We could give the monster some bullet-proof shoes with steel toes...:)