Topic: New proposal: Optimizer hints
Author: jbuck@synopsys.com (Joe Buck)
Date: 24 Mar 1995 22:42:56 GMT
>>>>>> "RFG" == Ronald F Guilmette <rfg@rahul.net> writes:
RFG> I still hold out some vague hope that the C++ standardization
RFG> committee will adopt the `restricted pointers' proposal which has
RFG> been approved by the international C standardization committee
RFG> (WG14) for inclusion into a forthcoming Technical Report (and
RFG> probably into the next revision of the C standard).
dag@control.lth.se (Dag Bruck) writes:
>I think that is a good reason for the C++ committee (WG21) to take it
>easy and _follow_ the C committee, instead of rushing ahead. If
>restricted pointers are adopted in C, C++ should follow i.m.o.
I would agree with one important caveat: you want to avoid the situation
where C and C++ diverge, in the sense that C and C++ are both extended
to solve some problem, but the extension is done in a different manner.
A later reunification would wind up with two ways of doing the same thing.
The only case I know of where this is actually happening is that the
Numerical C folks are adding a complex builtin type and C++ has a standard
complex class, but others may develop.
--
-- Joe Buck <jbuck@synopsys.com> (not speaking for Synopsys, Inc)
Phone: +1 415 694 1729
Author: "Ronald F. Guilmette" <rfg@rahul.net>
Date: 26 Mar 1995 22:20:14 GMT
In article <3kvhtg$q4s@hermes.synopsys.com>,
Joe Buck <jbuck@synopsys.com> wrote:
>
>... you want to avoid the situation
>where C and C++ diverge, in the sense that C and C++ are both extended
>to solve some problem, but the extension is done in a different manner.
>A later reunification would wind up with two ways of doing the same thing.
>The only case I know of where this is actually happening is that the
>Numerical C folks are adding a complex builtin type and C++ has a standard
>complex class, but others may develop.
In the case of the proposed complex built-in type for C, the folks who
have done the work on that have (I believe) been very mindful of
minimizing any unnecessary separation or differentiation between
C and C++ in this area. As a matter of fact, I seem to vaguely recall
that the folks who did the work on this (primarily Jim Thomas?) actually
prototyped the proposed new C language feature in C++. From this, I
believe it may be construed that whatever is proposed in the way
of a complex type for C will also be implementable in C++ (although
perhaps as an ordinary library class, as opposed to a built-in type).
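Just to illustrate the point: everything such a type needs can be expressed
as an ordinary class along roughly these lines (a bare sketch only, not the
actual proposal and not the draft library class):

  // A bare sketch, to show that `complex' needs nothing beyond an
  // ordinary class with overloaded operators.
  class double_complex {
  public:
      double_complex(double re = 0.0, double im = 0.0) : re_(re), im_(im) { }
      double real() const { return re_; }
      double imag() const { return im_; }
      friend double_complex operator+(const double_complex& a,
                                      const double_complex& b)
          { return double_complex(a.re_ + b.re_, a.im_ + b.im_); }
      friend double_complex operator*(const double_complex& a,
                                      const double_complex& b)
          { return double_complex(a.re_ * b.re_ - a.im_ * b.im_,
                                  a.re_ * b.im_ + a.im_ * b.re_); }
  private:
      double re_, im_;
  };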
--
-- Ron Guilmette, Sunnyvale, CA ---------- RG Consulting -------------------
---- E-mail: rfg@segfault.us.com ----------- Purveyors of Compiler Test ----
-------------------------------------------- Suites and Bullet-Proof Shoes -
Author: dag@control.lth.se (Dag Bruck)
Date: 22 Mar 1995 07:07:26 GMT
>>>>> "RFG" == Ronald F Guilmette <rfg@rahul.net> writes:
RFG> I still hold out some vague hope that the C++ standardization
RFG> committee will adopt the `restricted pointers' proposal which has
RFG> been approved by the international C standardization committee
RFG> (WG14) for inclusion into a forthcoming Technical Report (and
RFG> probably into the next revision of the C standard).
I think that is a good reason for the C++ committee (WG21) to take it
easy and _follow_ the C committee, instead of rushing ahead. If
restricted pointers are adopted in C, C++ should follow i.m.o.
-- Dag Bruck
Author: markg@athens.emi.net (Mark Grosberg)
Date: Fri, 17 Mar 1995 19:14:27 GMT
Tom O Breton (tob@world.std.com) wrote:
: mstankus@oba.ucsd.edu (Mark Stankus) writes:
: >
: I had essentially the same idea a while back. I intended to use it in
: another language project, but since you mention it, let's compare notes.
So did I. My idea was to let the code itself compile itself.
Basically, the idea I had was the ability to execute code at compile
time, similar to immediate words in FORTH. By doing this, routines could
optimize special cases, constants, and even more complicated things by
taking the time to put conditionals and code-generation sprinkles in the
routines. The other idea was that things like intrinsic operators become
functions, just like anything else, and they call the lowest form of the
code generator (pipe out an assembly instruction) to generate all of the
opcodes. This almost makes the compiler totally portable, sans register
allocation, which I had put into the compiler to "allocate" registers for
the assembler output.
But I never implemented the idea in code, sigh. Too busy.
: IMO this is particularly handy with inlined functions, since you can get
: optimizations without breaking up logical units.
Yup, then the compiler can look across inlined function boundaries and
optimize for the greater good.
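A trivial made-up illustration of the kind of thing I mean (the class
here is invented for this post):

  class Counter {
      int n_;
  public:
      Counter() : n_(0) { }
      int value() const { return n_; }   // trivially inline-able
      void bump()       { ++n_; }
  };

  int twice(const Counter& c)
  {
      // Once value() is inlined, the optimizer sees two loads of c.n_
      // with nothing between them, and can load it just once.
      return c.value() + c.value();
  }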
: I also figured that a very useful mechanism in conjunction with this is
: to be able to guarantee that intermediate functions have no side
: effects. That is, if we can guarantee that foo does not read from or write to
: x:
: {
: x.set(A);
: foo();
: x.set(B);
: }
: We can optimize to:
: {
: x.set(A|B);
: foo();
: }
: This is easy if x is a local variable, but if x is a member of
: something foo can see (including the global namespace) it needs to be
: explicit.
I like that, especially useful for automatic parallelization.
L8r,
Mark G.
Author: "Ronald F. Guilmette" <rfg@rahul.net>
Date: 18 Mar 1995 10:07:30 GMT
In article <3k705j$2c6@wcap.centerline.com>,
David Chase <chase@centerline.com> wrote:
>...
>3. This is probably subsumed by a different hint, namely "there
> is no aliasing here"...
I still hold out some vague hope that the C++ standardization committee
will adopt the `restricted pointers' proposal which has been approved by
the international C standardization committee (WG14) for inclusion into
a forthcoming Technical Report (and probably into the next revision of
the C standard).
The importance of restricted pointers to serious programming, especially
programming that deals heavily with arrays and matrices, cannot be overstated.
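To make the aliasing point concrete, here is the usual sort of example,
spelled with the qualifier from the WG14 draft (`restrict'); note that no
such qualifier exists in standard C++ today, so treat this purely as an
illustration of the idea:

  // Illustration only: `restrict' is the draft C spelling, not C++.
  void saxpy(int n, double a, double * restrict x, double * restrict y)
  {
      // The qualifiers assert that x and y do not overlap, so the
      // compiler is free to unroll, software-pipeline, or vectorize
      // this loop.  Without that assertion it must assume a store
      // through y might change some later x[i], which blocks most of
      // those transformations.
      for (int i = 0; i < n; i++)
          y[i] = a * x[i] + y[i];
  }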
--
-- Ron Guilmette, Sunnyvale, CA ---------- RG Consulting -------------------
---- E-mail: rfg@segfault.us.com ----------- Purveyors of Compiler Test ----
-------------------------------------------- Suites and Bullet-Proof Shoes -
Author: pstemari@erinet.com (Paul J. Ste. Marie)
Date: Sat, 18 Mar 95 17:38:39 GMT
In article <D5I4At.FDs@ucc.su.OZ.AU>,
maxtal@Physics.usyd.edu.au (John Max Skaller) wrote:
:In article <3k5odg$178@news.erinet.com>,
:Paul J. Ste. Marie <pstemari@erinet.com> wrote:
:>In article <81299@sdcc12.ucsd.edu>,
:> mstankus@oba.ucsd.edu (Mark Stankus) wrote:
:>:...
:>: forall(const Polynomial & a)
:>: forall(const Polynomial & b)
:>: forall(const Polynomial & c)
:>: is_equivalent( (a+b)+c, a+(b+c) );
:>
:>It's an interesting idea, nonetheless. Giant ugly pragmas would
:>seem to be the right place for it.
:
:Gak! The main problem with Paul's proposal is that it isn't
:general enough.
Not my proposal. Check the attribution more carefully.
Author: tob@world.std.com (Tom O Breton)
Date: Mon, 20 Mar 1995 00:57:39 GMT
markg@athens.emi.net (Mark Grosberg) writes:
> So did I. My idea was to let the code itself compile itself.
> Basically, the idea I had was the ability to execute code at compile
> time, similar to immediate words in FORTH. By doing this, routines could
> optimize special cases, constants, and even more complicated things by
> taking the time to put conditionals and code-generation sprinkles in the
> routines. The other idea was that things like intrinsic operators become
> functions, just like anything else, and they call the lowest form of the
> code generator (pipe out an assembly instruction) to generate all of the
> opcodes. This almost makes the compiler totally portable, sans register
> allocation, which I had put into the compiler to "allocate" registers for
> the assembler output.
I concluded that to actually emit efficient code, more is needed. The
layer of indirection provided by optimizer hints is welcome, but
probably insufficient.
For instance, pipelining works best if operations are interleaved in a
certain order. Caching works best if data is looked at in a certain
order. And so forth.
Also, all the different addressing modes, different combinations of source
registers, etc. must be handled. For instance, how many versions would
the archetypical int operator+ ( int, int ) need, and how would the
compiler select among them?
Also, you need some way of letting new variables come into being,
sometimes as register variables, for register caching and for
temporaries.
Tom
--
tob@world.std.com
TomBreton@delphi.com: Author of The Burning Tower
Author: jason@cygnus.com (Jason Merrill)
Date: 17 Mar 1995 06:37:29 GMT
>>>>> John Max Skaller <maxtal@Physics.usyd.edu.au> writes:
> There is a simple level of this kind of thing which can be
> easily added to C++ (or any similar language) which _also_
> provides enforcement: "Attributes". For example
> a "pure" function is one which depends only on its parameters.
As implemented by g++'s __attribute__ ((const)) or HP's
#pragma NO_SIDE_EFFECTS...
I've considered adding __attribute__s reflexive, commutative, transitive,
etc., to g++, but it's never gotten beyond idle speculation.
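For reference, the g++ spelling looks like this (isqrt is just an
invented example function):

  // The `const' attribute asserts that the result depends only on the
  // arguments and that the call has no side effects, so the compiler
  // may merge repeated calls with the same argument.
  extern int isqrt(int x) __attribute__ ((const));

  int diag_estimate(int area)
  {
      // With the attribute, g++ is allowed to compute isqrt(area) once:
      return isqrt(area) + isqrt(area);
  }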
Jason
Author: JdeBP@jba.co.uk (Jonathan de Boyne Pollard)
Date: 17 Mar 1995 16:41:00 -0000
John Max Skaller (maxtal@Physics.usyd.edu.au) wrote:
: There is a simple level of this kind of thing which can be
: easily added to C++ (or any similar language) which _also_
: provides enforcement: "Attributes".
Just a question of terminology while I read. Do you mean by "attributes" the
attributes that I am used to (public interface data members of a class
with implicit accessor methods), or is this Yet Another Meaning For An
OO Term?
Author: mstankus@oba.ucsd.edu (Mark Stankus)
Date: 14 Mar 95 22:49:14 GMT
Hi,
The following is a LaTeX document for a possible
extension of C++. Most of it is ``ASCII-like''
so that you can read it on the screen. The code
examples are in verbatim.
The dvi, PostScript and tex documents are accessible via WWW by
mosaic http://math.ucsd.edu/~mstankus/optimizer_hints.dvi
mosaic http://math.ucsd.edu/~mstankus/optimizer_hints.ps
mosaic http://math.ucsd.edu/~mstankus/optimizer_hints.tex
The idea is pretty simple: optimization is limited for
overloaded operators on classes unless the
operations can be inlined. The key is to provide a
mechanism for telling the optimizer theorem-like
statements about what is true
of a piece of C++ code. There are other benefits too
(like "solving" the composite operator problem).
Thanks for your time,
Mark Stankus
\documentstyle[epsf,leqno]{article}
\oddsidemargin0in
\evensidemargin0in
\textwidth6in
\title{Optimizer Hints, Composite Operators
%\today
}
\author{ M. Stankus \thanks{Department of
Mathematics,
University of California, San Diego,
California}
}
\date{}
\begin{document}
\maketitle
%\pagenumbering{roman}
%\listoffigures
\epsfverbosetrue
\epsfxsize=3in
%\epsfxsize=\textwidth
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%% MACROS %%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\newtheorem{example}[equation]{Example}
\newtheorem{result}[equation]{Result}
\newtheorem{th}{Theorem}
\newtheorem{cor}{Corollary}
\newtheorem{defn}{Definition}
\newtheorem{prop}{Proposition}
\newtheorem{open}{Open Question}
\newtheorem{conj}{Conjecture}
\newtheorem{lemma}{Lemma}
\newcommand{\cir}{\mbox{$\bf \partial D$} }
\newcommand{\e}{\mbox {$e^{i \theta}$}}
\newcommand{\defin}{\mbox{ $\stackrel{\rm def}{=}$ }}
\newcommand{\uncopt}{\mbox{$\bf UNCOPT$} }
\newcommand{\opt}{\mbox{\bf OPT} }
\newcommand{\opta}{\mbox{$\bf OPT_{A_N}$} }
\newcommand{\opte}{\mbox{$\bf OPT_{E}$} }
\def\theequation{\thesection.\arabic{equation}}
\def\openmtx#1#2{\renewcommand{\arraystretch}{1.2}%
\left. \begin{array}{#1}#2\end{array} \right. }
\def\RR{{\bf R}}
\def\CC{{\bf C}}
\def\TT{{\bf T}}
\def\DD{{\bf D}}
\def\scr#1{{\cal #1}}
\def\ro#1{{\rm(#1)}}
\def\bo#1{{\bf(#1)}}
\def\endproof{\qquad\vrule height4pt width4pt depth0pt\medskip}
\def\OPTp{$ OPT^p$}
\def\OPTpbig{$\hbox{\mbig OPT\/}_{\displaystyle
w}^{\displaystyle p}$}
\def\OPTo{$ OPT_w^1$}
\def\OPTi{$ OPT^\infty$}
\def\OPTinf{$ OPT^\infty$}
\def\OPTinfbig{$\hbox{\mbig OPT\/}^{\hbox{\sbig \char'61}}$}
\def\ROPTbig{$\hbox{\mbig ROPT\/}^{\hbox{\sbig \char'61}}$}
\def\ROPT{$ ROPT^\infty$}
\def\OPT1{$OPT^1$}
\def\Hinf{{\mbox{$H^\infty$}}}
\baselineskip=12pt
\section{Introduction}
I propose that a new type of
design-decision information be incorporated in
the interface to a class. It would resemble
the other kinds of sections in a class interface
(public:, private:, protected:), but would be devoted to
giving the optimizer hints about how
to optimize the code. This grew out of two
real problems; see the two sections following
this introduction.
I am undecided as to whether this
should be part of the regular source code
or provided as some kind of auxiliary file to the
compiler.
I have included a large number of little sections near
the end which answer the specific questions
brought up in Section 4 of ``The Design and
Evolution of C++''.
I have tried to keep the hype to a minimum.
Please let me know what you think, positive or
negative. If positive, what is the next informal step?
If the response is basically that ``it sounds good, but
I don't want to make it part of the standard, maybe it
should be an extra file read in by the compiler
via a
\begin{verbatim}
-optimizer_hints optimizer_file.h
\end{verbatim}
option on the compiler'', then let
me know that.
My e-mail is mstankus@oba.ucsd.edu.
\bigskip
\noindent
Thanks, \\
Mark Stankus
\tableofcontents
\part{Real problems}
\section{A Real Problem: Composite Operators}
If one has a matrix class,
it is tempting to try to optimize code such as
\begin{verbatim}
Matrix m, n, p, q;
// ... code to fill m, n and p
q = m + n * p;
\end{verbatim}
One obvious way is to have a function
\begin{verbatim}
class Matrix
{
//.. code
public:
void fastEqualPlusTimes(const Matrix & m, const Matrix & n, const Matrix & p);
// ... code
};
\end{verbatim}
and type
\begin{verbatim}
Matrix m, n, p, q;
// ... code to fill m, n and p
q.fastEqualPlusTimes(m, n, p);
\end{verbatim}
This is pretty ugly. What one could do, though, is {\it tell} the {\it optimizer}
to change
\begin{verbatim}
q = m + n * p;
\end{verbatim}
into
\begin{verbatim}
q.fastEqualPlusTimes(m, n, p);
\end{verbatim}
The user could type something which looked nice and get the faster code.
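For concreteness, here is one way such a member function might be
written. This sketch is not part of the proposal; it assumes that
Matrix stores doubles and provides rows(), cols() and an element
accessor operator()(int,int) (const and non-const), and that the
destination does not overlap the arguments.
\begin{verbatim}
void Matrix::fastEqualPlusTimes(const Matrix & m, const Matrix & n,
                                const Matrix & p)
{
    // Accumulate m + n*p directly into *this, so no temporary Matrix
    // is created for n*p or for m + (n*p).
    for (int i = 0; i < rows(); ++i)
        for (int j = 0; j < cols(); ++j)
        {
            double sum = m(i, j);
            for (int k = 0; k < n.cols(); ++k)
                sum += n(i, k) * p(k, j);
            (*this)(i, j) = sum;
        }
}
\end{verbatim}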
For the purposes of discussion, let us suppose that
we have new keywords
\begin{verbatim}
optimizer_hint, forall, simplify_code, is_equivalent
\end{verbatim}
An example of what this might look like:
\begin{verbatim}
class Matrix
{
// ... regular code
// a new section in the interface to a class
optimizer_hint:
// a hint for the compiler... there is no implementation counterpart
forall(const Matrix & m)
forall(const Matrix & n)
forall(const Matrix & p)
forall(Matrix & q)
simplify_code(q=m+n*p, q.fastEqualPlusTimes(m, n, p) );
};
\end{verbatim}
\section{A Real Problem: Optimization: Built-in types get optimized more than classes}
When a class has inlined overloaded operators,
the optimizer can attempt to optimize the code aggressively.
However, suppose that a class
has many member variables and its arithmetic
does not lend itself to inlining, as in the case of a polynomial.
In that case
operator + would not be inlined (the code would involve a for loop)
and so the optimizer would not be very aggressive about optimizing it.
However, since addition is commutative and associative for polynomials,
the code
\begin{verbatim}
q = p + r - p - r + w;
\end{verbatim}
is equivalent to
\begin{verbatim}
q = w;
\end{verbatim}
Again, we should {\it tell} the {\it optimizer} this.
For example,
\begin{verbatim}
class Polynomial
{
public:
// ...code...
optimizer_hint:
// the next line means if you have a+(-a)+b then use b instead
forall(const Polynomial & a)
forall(const Polynomial & b)
simplify_code(a+ -a +b, b);
// the next line says that you can switch a+b for b+a
forall(const Polynomial & a)
forall(const Polynomial & b)
is_equivalent(a+ b, b+a);
//note: with some work on the part of the compiler writers,
// the above two hints would do the following automatically also.
// See the results in the field of Computer Science called
// Theorem Proving.
//forall(const Polynomial & a)
// forall(const Polynomial & b)
// simplify_code(b+a+ -a, b);
//forall(const Polynomial & a)
// forall(const Polynomial & b)
// simplify_code(a+b+ -a, b);
// associativity
forall(const Polynomial & a)
forall(const Polynomial & b)
forall(const Polynomial & c)
is_equivalent( (a+b)+c, a+(b+c) );
};
\end{verbatim}
I now continue with a short laundry list of other examples:
Now tell the optimizer that -(-a) is a:
\begin{verbatim}
class Polynomial
{
// ... as above ...
optimizer_hint:
// ... as above ...
// the next line says that -(-a) can be simplified to a
forall(const Polynomial & a)
simplify_code(-(-a), a);
};
\end{verbatim}
Let the compiler know that a+(-a) is the
default value Polynomial() (with wrinkles :--) ):
\begin{verbatim}
class Polynomial
{
//... as above ...
optimizer_hint:
//.. as above ...
// How to deal with the a+-a is 0 statement
// the next line means if you have a+(-a) then use Polynomial() instead
forall(const Polynomial & a)simplify_code(a+ -a, Polynomial() );
// but now...
forall(const Polynomial & a) simplify_code(a+Polynomial(), a);
// for things like 'Polynomial p(q-q);'
simplify_code(Polynomial(Polynomial() ), Polynomial() );
};
\end{verbatim}
Now for a standard optimizer trick:
\begin{verbatim}
class Polynomial
{
//... as above ...
optimizer_hint:
//... as above ...
//eliding-constructors (g++)
forall(Polynomial a)
forall (const Polynomial & b)
simplify_code(Polynomial a = b, Polynomial a(b) );
};
\end{verbatim}
Avoiding unnecessary copying for arithmetic operators:
\begin{verbatim}
class Polynomial
{
// ... as above ...
void fastAdd(const Polynomial & a, const Polynomial & b);
optimizer_hint:
// ... as above ...
// for avoiding unnecessary copying
forall(const Polynomial & a)
forall(const Polynomial & b)
forall(Polynomial & c)
simplify_code(c=a+b, c.fastAdd(a, b) );
};
\end{verbatim}
\part{Standard questions}
\section{Should it be part of the language?}
I was thinking of three alternatives:
\begin{description}
\item{(1)}
{
It could be part of the interface. The drawback is that it is unlike other
parts of the interface in that there is no corresponding code in the
implementation. It does, however, document the design decisions
which the author of the program intended.
}
\item{(2)}
{
It could be a different type of source file which the compiler reads in.
}
\item{(3)}
{
What do you suggest?
}
\end{description}
\section{Will it break existing code?}
I don't think so. Since it is a new section of the interface, the only problem is the introduction
of new keywords. This could be addressed by
\begin{description}
\item{(1)}
{
Well chosen keywords
}
\item{(2)}
{
Localized scoping on the keywords
}
\item{(3)}
{
What do you suggest?
}
\end{description}
\section{Does it force a programming style? No}
The hints are not required so one can totally ignore them.
\section{Are optimizer hints affordable? Yes}
If hints are included, they are ignored unless the optimizer
is running. Since the hints are for the optimizer,
compilations will probably take longer and run
times will be shorter.
\section{Transition path? Yes}
Since hints are only suggestions, compiler writers could add
them at will. An implementation would
follow along the lines:
\begin{description}
\item{(1)}
{
Update the parser so that the hints can be added but ignored.
}
\item{(2)}
{
Build good overloading resolution.
}
\item{(3)}
{
Update the optimizer as much as desired.
}
\item{(4)}
{
Update the documentation as to which hints actually do something.
Repeat by going to step 3.
}
\end{description}
\section{Support sound design decisions? Yes}
Optimizer hints allow and encourage the designer to
document his design decisions in the class that
is being implemented.
\section{Say what you mean? Yes}
Optimizer hints are hints. They may or may not be used.
Optimizer hints allow one to implement composite
operators.
\section{Optimizer Hints lead to new bugs or not? (Not)}
It is possible that a person could produce a bug
by introducing an incorrect optimizer hint.
They could find it by turning the optimizer off.
Personally, I think that since writing hints
is easy and they correspond closely to design
decisions, an optimizer hint that makes a
program crash is a big hint that
there is a design flaw.
For example, it would provide an easy
check that writing a composite operator
\begin{verbatim}
.=.*.+.( ... )
\end{verbatim}
corresponds to the equivalent code with $+$, $*$ and $=$.
\section{The type system}
If one wants to indicate that addition of polynomials is
a commutative operation, then an optimizer hint might
look like the following.
\begin{verbatim}
class Polynomial
{
// .. code ..
optimizer_hint:
forall(const Polynomial & a)
forall(const Polynomial & b)
is_equivalent(a+b, b+a);
};
\end{verbatim}
The programmer would have to supply type information
and that type information is needed by the optimizer.
\section{Order Dependencies? No}
The optimizer hints could be in any number of
places in the interface, just as public: and
private: are.
\section{Easy to teach? Yes, but maybe no}
Well, it would be easy to teach mathematicians
and computer scientists. The formal notion of forall
might be new to some programmers. It should
be easier to teach than virtual functions.
\section{Preprocessor? No}
This has nothing to do with the preprocessor.
\section{Does it cost you if you don't use it? No}
If there were any overhead at all for supplying the
feature and not using it, it would be in compiler time
(unless the implementer of C++ broke some optimization
feature when adding optimizer hints).
\section{Using with Inheritance and Libraries}
The example in this section might be considered
weird. I don't know if the code below
would actually compile.
If the optimizer hints were part of the interface, then
they would be inherited. One could then write classes
which contained nothing but optimizer hints
(Hope this is not a repulsive idea...).
Using little classes with multiple inheritance
is common practice, right?
One could write classes which gave optimizer hints
and have optimizer hint libraries.
The following example might be in a library for
algebraists.
\begin{verbatim}
// nothing but optimizer hints!!!
template<class Scalars, class Vectors>
class VectorSpaceHypothesis
{
optimizer_hint:
forall(const Vectors & a)
forall(const Vectors & b)
is_equivalent(a+b, b+a);
forall(const Vectors & a)
forall(const Vectors & b)
forall(const Vectors & c)
is_equivalent( (a+b)+c, a+(b+c) );
forall(const Vectors & a)
forall(const Vectors & b)
forall(const Scalars & c)
is_equivalent(c*(a+b), c*a + c*b);
// etc.
forall(const Vectors & a)
forall(const Vectors & b)
forall(Vectors & c)
simplify_code(c=a+b, c.fastAdd(a, b) );
};
// g++ Rational
class Rational;
// needs to be predeclared due to the inheritance below
class Polynomial;
// multiple inheritance
class Polynomial :
public VectorSpaceHypothesis<Rational, Polynomial>,
public otherBaseClass
{
public:
// ... constructors and such ...
// THE FOLLOWING FUNCTION WILL ALWAYS BE OPTIMIZED AWAY
Polynomial operator + (const Polynomial & aPoly);
// like (*this) = first + second;
void fastAdd(const Polynomial & first,
const Polynomial & second);
// etc...
};
\end{verbatim}
\section{Difficulty to implement? Could do in steps.}
Once the ideas and syntax are decided, one just has to begin
by parsing the text correctly and checking overloading,
type matching and such (e.g., we must have defined
operator + in the above examples); but since these are
only hints, a minimal implementation would not
have to do anything more. I figure that people who
devote their lives to optimization techniques
and/or having computers prove theorems could then run with the idea.
Of course, people might become hungry for
optimizer enhancements if they start writing
optimizer hints and the hints are not implemented.
\section{Program verification}
If people start writing in optimizer hints, all of the
source code including the hints could be fed into
program checkers and verifiers. Perhaps the hints could be
proven (mathematically) correct.
Perhaps they could be used as direction for the
program verifier.
\part{Closing}
The use of such optimizer hints would tie into the field of
computer algebra and Theorem Proving.
See, for example, the theory of Gr\"obner
bases. I could get some references to the material.
Should there be other stuff in the language which supports
design information?
\bigskip
\noindent
Mark Stankus
\noindent
\begin{verbatim}
http://math.ucsd.edu/~mstankus/
mstankus@oba.ucsd.edu
\end{verbatim}
\end{document}
Author: pstemari@erinet.com (Paul J. Ste. Marie)
Date: Wed, 15 Mar 95 03:51:22 GMT
In article <81299@sdcc12.ucsd.edu>,
mstankus@oba.ucsd.edu (Mark Stankus) wrote:
:...
: forall(const Polynomial & a)
: forall(const Polynomial & b)
: forall(const Polynomial & c)
: is_equivalent( (a+b)+c, a+(b+c) );
Unfortunately, this isn't true for a lot of the builtin types, let
alone user-defined ones. For example,
forall(double a)
forall(double b)
forall(double c)
is_not_at_all_equivalent( (a+b)+c, a+(b+c) );
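A concrete pair of values (picked here purely for illustration, on an
IEEE-double implementation) where the two groupings differ:

  #include <stdio.h>

  int main()
  {
      double a = 1.0e16, b = 1.0, c = 1.0;
      // In (a+b)+c each 1.0 is lost to rounding against 1.0e16;
      // in a+(b+c) the 2.0 is large enough to survive.
      printf("%.1f\n", (a + b) + c);   // 10000000000000000.0
      printf("%.1f\n", a + (b + c));   // 10000000000000002.0
      return 0;
  }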
It's an interesting idea, nonetheless. Giant ugly pragmas would
seem to be the right place for it.
Author: tob@world.std.com (Tom O Breton)
Date: Wed, 15 Mar 1995 04:35:49 GMT
mstankus@oba.ucsd.edu (Mark Stankus) writes:
>
I had essentially the same idea a while back. I intended to use it in
another language project, but since you mention it, let's compare notes.
I didn't think of it in terms of reducing formulas ("a = b + c"), but
rather in terms of substituting lists of functions for others, which is
essentially the same thing, but a bit more general. I had the syntax of
an argument list instead of for_all( Type T ).
IMO this is particularly handy with inlined functions, since you can get
optimizations without breaking up logical units.
One other use I have is optimizations that combine invocations of a
function, EG: set(A); set(B); --> set(A|B);
More ambitiously, I also had in mind a use I call "induction", that
would require annotating intervening sections of "outside" code, where
something like:
{
/* x.seek( y ); ... x.seek( y + 1 ); --> x.seek( y ); ... x.advance(); */
x.seek( y );
//...
x.seek( y + 1 );
//...
x.seek( y + 2 );
//...
}
...could be transformed into cheaper operations:
{
x.seek( y );
//...
x.advance();
//...
x.advance();
//...
}
I also figured that a very useful mechanism in conjunction with this is
to be able to guarantee that intermediate functions have no side
effects. That is, if we can guarantee that foo does not read from or write to
x:
{
x.set(A);
foo();
x.set(B);
}
We can optimize to:
{
x.set(A|B);
foo();
}
This is easy if x is a local variable, but if x is a member of
something foo can see (including the global namespace) it needs to be
explicit.
Your idea of classes inheriting these optimizations is something that
had not occurred to me, and is probably a great idea.
Tom
--
tob@world.std.com
TomBreton@delphi.com: Author of The Burning Tower
Author: chase@centerline.com (David Chase)
Date: 15 Mar 1995 15:12:51 GMT
The three problems I immediately saw with this
were:
1. Use of the forall keyword; or rather, the fact that you always
used it. Does it have any significance, or is it just syntactic
noise? Could the same purpose be served by something less
noisy?
2. The patterns that you describe are fairly easy to match,
once all the verbiage ("forall") is stripped away. That
is, this is just a SMOEquationalP (I've done work in this
sort of pattern-matching, yes). However, do you really
mean for this to happen:
q=q+q*q ==> q.fastEqualPlusTimes(q,q,q)
Recall that the parameters to fastEqualPlusTimes are
passed by reference, so overwriting q in this case would
be bad. (A concrete illustration follows after point 3.)
3. This is probably subsumed by a different hint, namely "there
is no aliasing here". Sometimes, in the case of temporary
arrays, you can figure it out already, but Fortran-like
aliasing information is what would be really helpful, in
many places. That should let the optimizer do this
transformation for you, in addition to many others (for
instance, how fast is your matrix multiplication? No sense
doing this unless you've already cache and register blocked
the multiplication. Not many (Fortran) compilers do that well,
but what little they do can still be a substantial improvement.
This is, perhaps, merely an artifact of your example.)
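To make the hazard in point 2 concrete, here is a toy two-by-two
stand-in for the proposal's Matrix class (invented for this post).
Blindly rewriting q = q + q*q into q.fastEqualPlusTimes(q, q, q)
clobbers the inputs partway through:

  #include <stdio.h>

  // Toy 2x2 matrix whose fastEqualPlusTimes writes its result straight
  // into *this, with no temporary.
  struct Mat2 {
      double a[2][2];
      void fastEqualPlusTimes(const Mat2& m, const Mat2& n, const Mat2& p)
      {
          for (int i = 0; i < 2; i++)
              for (int j = 0; j < 2; j++) {
                  double sum = m.a[i][j];
                  for (int k = 0; k < 2; k++)
                      sum += n.a[i][k] * p.a[k][j];
                  a[i][j] = sum;     // overwrites n and p if they are *this
              }
      }
  };

  int main()
  {
      Mat2 q = { { {1, 2}, {3, 4} } };
      Mat2 old = q;
      Mat2 good;
      good.fastEqualPlusTimes(old, old, old);   // q + q*q done safely
      q.fastEqualPlusTimes(q, q, q);            // the rewritten form
      printf("safe:    %g %g %g %g\n",
             good.a[0][0], good.a[0][1], good.a[1][0], good.a[1][1]);
      printf("aliased: %g %g %g %g\n",
             q.a[0][0], q.a[0][1], q.a[1][0], q.a[1][1]);
      // prints:  safe:    8 12 18 26
      //          aliased: 8 26 39 1034
      return 0;
  }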
Also, you could use a better example than matrix multiplication (if
what you are selling is speed, rather than a reduction in useless
ugly temporaries), because the multiplication step dominates both
the addition and the copying. Multiplication is O(n-cubed) using
the usual algorithm, versus O(n-squared) for all the other
operations. Compared to a big multiplication, who cares about
this?
So, again, you need a better example to motivate this. You should
ensure that the time saved is not trivial compared to the other
operations that you are performing, and that the optimizations that
you enable are more useful than the ones enabled by the addition
of aliasing hints (this assumes that you know something about
those optimizations -- otherwise, you really have no business
making such a proposal).
yours,
David Chase (speaking for myself)
Author: maxtal@Physics.usyd.edu.au (John Max Skaller)
Date: Wed, 15 Mar 1995 21:42:28 GMT
In article <3k5odg$178@news.erinet.com>,
Paul J. Ste. Marie <pstemari@erinet.com> wrote:
>In article <81299@sdcc12.ucsd.edu>,
> mstankus@oba.ucsd.edu (Mark Stankus) wrote:
>:...
>: forall(const Polynomial & a)
>: forall(const Polynomial & b)
>: forall(const Polynomial & c)
>: is_equivalent( (a+b)+c, a+(b+c) );
>
>It's an interesting idea, nonetheless. Giant ugly pragmas would
>seem to be the right place for it.
Gak! The main problem with Paul's proposal is that it isn't
general enough.
There is a simple level of this kind of thing which can be
easily added to C++ (or any similar language) which _also_
provides enforcement: "Attributes". For example
a "pure" function is one which depends only on its parameters.
To go any further than attributes requires a full scale
inference engine and a suitable logic language.
Apparently you can do this in Prolog, using Prolog itself
as the logic language :-)
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd,
81A Glebe Point Rd, GLEBE Mem: SA IT/9/22,SC22/WG21
NSW 2037, AUSTRALIA Phone: 61-2-566-2189