Note the mention of Rails at the end of Section 2.0.
On 2/11/07, Francis C. [email protected] wrote:
http://www.eecs.berkeley.edu/Pubs/TechRpts/2006/EECS-2006-183.pdf
Yeah, big +1. The most thought-provoking part for me was this (from page
35):
It is striking, however, that research from
psychology has had almost no impact, despite the obvious
fact that the success of these models will be strongly
affected by the human beings who use them. Testing
methods derived from the psychology research community
have been used to great effect for HCI, but are sorely
lacking in language design and software engineering. For
example, there is a rich theory investigating the causes of
human errors, which is well known in the human-computer
interface community, but apparently it has not penetrated
the programming model and language design community.
…
We believe that integrating research on human psychology
and problem solving into the broad problem of designing,
programming, debugging, and maintaining complex parallel
systems will be critical to developing broadly successful
parallel programming models and environments.
Keith
Keith F. wrote:
methods derived from the psychology research community
systems will be critical to developing broadly successful
parallel programming models and environments.
Keith
Well … yes and no … we should probably take this to “pragprog”, and
I’m going to, but …
- I’ve been here before – at the point where general-purpose SISD
architectures ran out of steam and special-purpose machines abounded. I
spent ten years working for a company, Floating Point Systems, that
made special-purpose machines. There’s a whole generation of people
out there, myself among them, that ended up finding other things to do
when the general-purpose SISD (and CISC) machine known as the Pentium
essentially wiped everything else off the map. So I view the current
“trend” to multicore systems and more dreams of massively parallel
computers becoming mainstream as only a temporary thing … a swing of a
pendulum to one extreme … general purpose SISD machines will be back!

- There’s an awful lot of specialized hardware in a modern PC for
audio and graphics already. By some strange coincidence, the sound card
architecture looks a lot like an FPS array processor. I’m not a
graphics geek, but I’d be willing to bet that what’s inside the
graphics chipsets looks a lot like the specialized image processing
computers folks came up with in the 1960s and 1970s. Somebody programs
these parallel and concurrent gizmos, and they obviously are getting it
right and have tools to help them get it right.

- To bring this back to Ruby, what I think Ruby needs, independent of
any trends in the underlying hardware, is:

a. support for all the commonly-used concurrency primitives, made as
efficient as possible in the underlying implementations. Most of them
are already there, including some, like Rinda/tuplespace, that aren’t
common in other languages (a brief sketch follows this list), and

b. efficient implementation of the low-level core types – integers,
rationals, multi-precision numbers, real and complex floating point,
multi-dimensional arrays, hashes, bit vectors. In short, it should not
be necessary to escape to C – ever.
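
For concreteness, here’s a minimal in-process sketch of the
Rinda/tuplespace primitive from point (a), using nothing but the
standard library; the tuple layout (:add / :result) is made up purely
for illustration:

  require 'rinda/tuplespace'

  ts = Rinda::TupleSpace.new

  # A worker thread blocks in take() until a matching tuple is written;
  # nil in the template acts as a wildcard.
  worker = Thread.new do
    _op, a, b = ts.take([:add, nil, nil])
    ts.write([:result, a + b])
  end

  ts.write([:add, 2, 3])
  puts ts.take([:result, nil]).last   # => 5
  worker.join

Put DRb in front of the same TupleSpace and the coordination works
across processes or machines, which is exactly the sort of primitive
point (a) is asking to have made fast.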
–
M. Edward (Ed) Borasky, FBG, AB, PTA, PGS, MS, MNLP, NST, ACMC§
http://borasky-research.blogspot.com/
If God had meant for carrots to be eaten cooked, He would have given
rabbits fire.
M. Edward (Ed) Borasky wrote:
[snip]
One other little piece of flame bait: the fact that Berkeley has a
combined EE-CS department that produced this paper is another symptom of
what’s wrong. Computer Science has become subordinated to Electrical
Engineering. I personally think that’s very, very wrong.
–
M. Edward (Ed) Borasky, FBG, AB, PTA, PGS, MS, MNLP, NST, ACMC§
http://borasky-research.blogspot.com/
If God had meant for carrots to be eaten cooked, He would have given
rabbits fire.
Francis C. wrote:
“trend” to multicore systems and more dreams of massively parallel
computers becoming mainstream as only a temporary thing … a swing of a
pendulum to one extreme … general purpose SISD machines will be back!
Are you expecting another round (or more) of massive improvements in
uniprocessor performance?
I can think of a number of possibilities – let’s really get out there:
1) Multivalue Logic (Nasty)
2) Extremely Long instruction words
3) Massively Deep look ahead Threads
However, Most Likely -- General Multi-core Processors with
Various Specialized Processors.
On 2/11/07, M. Edward (Ed) Borasky [email protected] wrote:
- I’ve been here before – at the point where general-purpose SISD
architectures ran out of steam and special-purpose machines abounded. I
spent ten years working for a company, Floating Point Systems, that
made special-purpose machines. There’s a whole generation of people
out there, myself among them, that ended up finding other things to do
when the general-purpose SISD (and CISC) machine known as the Pentium
essentially wiped everything else off the map. So I view the current
“trend” to multicore systems and more dreams of massively parallel
computers becoming mainstream as only a temporary thing … a swing of a
pendulum to one extreme … general purpose SISD machines will be back!
Are you expecting another round (or more) of massive improvements in
uniprocessor performance?
On 2/11/07, M. Edward (Ed) Borasky [email protected] wrote:
M. Edward (Ed) Borasky wrote:
[snip]
One other little piece of flame bait
the fact that Berkeley has a
combined EE-CS department that produced this paper is another symptom of
what’s wrong. Computer Science has become subordinated to Electrical
Engineering. I personally think that’s very very wrong.
I’d rather have an EE-CS department than a Math-CS department. I’d
rather have a CS department that recognizes that it, like IT, touches
almost everything else more, though.
-austin, has been through both styles before
Francis C. wrote:
computers becoming mainstream as only a temporary thing … a swing of a
pendulum to one extreme … general purpose SISD machines will be back!
Are you expecting another round (or more) of massive improvements in
uniprocessor performance?
I’ll invoke Arthur C. Clarke’s laws: “When a distinguished but elderly
scientist says something is impossible, he is usually proven wrong. When
he says something is possible, he is usually proven right.” I don’t know
how distinguished I am – after all, I don’t even have a PhD – but I
think I have the elderly part down.
But seriously, there are plenty of projects going on to improve
uniprocessor performance at the hardware and technology level, and
there’s no doubt in my mind that one or more of them will pay off. In
addition, there is a massive existing body of knowledge on exploiting
parallel computing in the three domains where it is most needed –
large-scale numerical computing, multi-media processing and large
databases.
And finally, today’s PC is very much a parallel machine even before
you put a dual-core processor package in it. As I noted before, your
multi-media work is mostly done by specialized processors, with the CPU
being a control processor only. And when you look inside the chip,
you’ll find a RISC/microprogrammed/“LIW-like” architecture capable of
dealing with multiple copies of the i386 SISD base. Somehow all this
parallelism gets designed, built and debugged, and manufactured on a
massive scale.
So yes, I am not only expecting faster uniprocessors, I am also
expecting an evolution, not a revolution, in the programming
language area. And as an alumnus of the previous parallel and RISC
“revolution”, I very much resent statements like “Researchers have the
rare opportunity to re-invent these cornerstones of computing, provided
they simplify the efficient programming of highly parallel systems.” and
“We concluded that sneaking up on the problem of parallelism via
multicore solutions was likely to fail and we desperately need a new
solution for parallel hardware and software.”
Bluntly put, it’s a simple matter of economics. The people who genuinely
need massive parallelism already have it, and all that has changed is
that the cost in dollars and watts is coming down. And the people who
don’t need it – people who can do their jobs or live their lives
without a dozen 3 gigaflop general purpose CPUs and a programming
language to exploit them – are not going to pay for them.
–
M. Edward (Ed) Borasky, FBG, AB, PTA, PGS, MS, MNLP, NST, ACMC§
http://borasky-research.blogspot.com/
If God had meant for carrots to be eaten cooked, He would have given
rabbits fire.
On 11 Feb 2007, at 19:29, Charles Thornton wrote:
with Various Specialized Processors.
And Quantum computers on Tuesday this week, apparently:
http://www.techworld.com/opsys/news/index.cfm?newsID=7972&pagtype=all
Seeing as it makes use of a Tuneable Flux Transformer, I’m pretty
sure it’s going to work like a dream
Austin Z. wrote:
I’d rather have an EE-CS department than a Math-CS department. I’d
rather have a CS department that recognizes that it, like IT, touches
almost everything else more, though.
-austin, has been through both styles before
Well … I’d rather have a math department, with both theoretical and
applied branches, a computer science department, a software engineering
department and an electrical engineering department. What I have a
problem with is one of these disciplines subordinating the others. But
it’s surely not a coincidence that UC Berkeley is near Silicon Valley.
Is there an Applied Mathematics Valley somewhere?
–
M. Edward (Ed) Borasky, FBG, AB, PTA, PGS, MS, MNLP, NST, ACMC§
http://borasky-research.blogspot.com/
If God had meant for carrots to be eaten cooked, He would have given
rabbits fire.
On 11 Feb 2007, at 20:09, M. Edward (Ed) Borasky wrote:
I’ll invoke Arthur C. Clarke’s laws: “When a distinguished but
elderly scientist says something is impossible, he is usually
proven wrong. When he says something is possible, he is usually
proven right.” I don’t know how distinguished I am – after all, I
don’t even have a PhD – but I think I have the elderly part down.
Quite so
Having not even read the piece…
I was following up on Software Transactional Memory from an earlier
Ruby T. posting, and that looks extremely promising. I’ve got this
thought that programming languages might change quite a lot in the
concepts they provide and emphasise. Perhaps when that happens, not
only will it get much easier to program, but parallel solutions will
no longer be hard. In fact, they might even be easier for a lot of
situations.
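
To make that a bit more concrete, here’s a toy sketch of the
atomic-block style that STM encourages. It just fakes atomicity with
one global lock (a real STM would run the blocks optimistically and
retry on conflict rather than serialising everything), but it shows the
shape of the programming model:

  require 'thread'

  STM_LOCK = Mutex.new   # stand-in for the STM runtime

  def atomic
    STM_LOCK.synchronize { yield }
  end

  account_a = 100
  account_b = 0

  threads = (1..10).map do
    Thread.new do
      atomic do
        account_a -= 10
        account_b += 10
      end
    end
  end
  threads.each { |t| t.join }

  puts account_a   # => 0
  puts account_b   # => 100

The attraction is composability: two atomic blocks written
independently can be nested or combined without the careful
lock-ordering discipline that explicit mutexes demand.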
Cheers,
Benjohn
Benjohn B. wrote:
However, Most Likely -- General Multi-core Processors
with Various Specialized Processors.
And Quantum computers on Tuesday this week, apparently:
http://www.techworld.com/opsys/news/index.cfm?newsID=7972&pagtype=all
Seeing as it makes use of a Tuneable Flux Transformer, I’m pretty sure
it’s going to work like a dream
Yeah … anybody remember gallium arsenide? Liquid nitrogen cooled CMOS
and the GF-10? Occam? People who said it was impossible to build a
vectorizing C compiler? Lisp machines?
Speaking of the “need” for faster computers, I keep thinking of the
story of the two hunters who suddenly discovered they were being chased
by a bear. One of them sat down and started to change into his running
shoes. The other one said, “I don’t care what kind of shoes you have,
you aren’t going to be able to outrun that bear.” To which the first
replied, “It’s not the bear I have to outrun.”
–
M. Edward (Ed) Borasky, FBG, AB, PTA, PGS, MS, MNLP, NST, ACMC§
http://borasky-research.blogspot.com/
If God had meant for carrots to be eaten cooked, He would have given
rabbits fire.
“M. Edward (Ed) Borasky” [email protected] writes:
M. Edward (Ed) Borasky wrote:
[snip]
One other little piece of flame bait
the fact that Berkeley has a
combined EE-CS department that produced this paper is another symptom of
what’s wrong. Computer Science has become subordinated to Electrical
Engineering. I personally think that’s very very wrong.
I’d rather that than what has happened to my old department, which has
become subordinate to IT! Instead of interesting courses on
algorithms/data structures, parallel programming, computability,
computer architecture, graphics and game programming, logic,
non-procedural languages etc., it’s now all web page design, Java
programming, and waffling pseudo computer science which is more closely
akin to science fiction.
With respect to your other post and comments on that article: while I
don’t disagree with your analysis as it relates to right now and the
immediate future, I do think we will see a growth in parallel systems.
There are already reports that the rate of increase in computing power
is starting to slow down. I think once we get closer to the limit and
we are not seeing the rapid increases we have seen for the past 20
years, there will be a growth in such technologies as we push for more
and faster processing power.
Parallel programming and architectures was one of my favorite courses
when I did my degree. I remember spending ages writing a C program
which ran on a special (extremely expensive) card for a PC that was
sort of like a poor man’s hypercube. It was a horrible environment
(OS-wise, something like DOS 1!), but it was a lot of fun. Actually, I
upset my lecturer as I somehow managed to fry the board (I think it was
on its way out, I was just unlucky enough to be the sucker using it at
the time!). The program was an implementation of the dining
philosophers problem. I’d say it is still one of the more interesting
and truly challenging little projects I’ve worked on.
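
For anyone who hasn’t run into it, here’s a tiny Ruby sketch of the
dining philosophers setup (five threads contending for five shared
forks), using the standard fix of always acquiring the lower-numbered
fork first so no circular wait, and hence no deadlock, can form. This
is only an illustration, not the original C program:

  require 'thread'

  forks = Array.new(5) { Mutex.new }

  philosophers = (0...5).map do |i|
    Thread.new do
      # Philosopher i needs forks i and (i + 1) % 5; taking the
      # lower-numbered one first breaks the circular wait.
      first, second = [i, (i + 1) % 5].sort.map { |n| forks[n] }
      5.times do
        first.synchronize do
          second.synchronize do
            # "eating"
          end
        end
        # "thinking"
      end
    end
  end

  philosophers.each { |p| p.join }
  puts "everyone ate without deadlocking"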
Tim
On Feb 11, 2007, at 10:20 AM, M. Edward (Ed) Borasky wrote:
one extreme … general purpose SISD machines will be back!
Let me rephrase that… general purpose computers always beat
special-purpose computers. I’ve seen LISP machines, database
machines, even full-text-search machines come and go. Right now,
it’s hard to be a single-core server chip. They’re all still very
general-purpose, it’s just that silicon builders, at the moment,
can take advantage of Moore’s law to build wider (multicore) CPUs,
but have failed to use it to make any one thread faster, with the
occasional exception of the IBM Power chips.
So I think the problem is real. [Disclosure: I work for Sun, maker
of the 8-core/32-thread T1, and there are more where that came from].
-Tim