Thinking Parallel

A Blog on Parallel Programming and Concurrency by Michael Suess

Parallel Programming is Entering the Mainstream – or isn’t it?

Alan Zeichick is convinced that parallel programming (threading, in this case) is conquering the desktop. To measure how far this adoption has come in a specific organization, he proposes a Threading Maturity Model (ThMM). But I would not have worded the headline of this article the way I did if I did not still see some question marks, and this article attempts to explain them a tiny bit…

But before I start with the question marks: I am also convinced that parallel programming in general, and threading in particular, is going to be used more often, simply because today's architectures require it, as I have stated e.g. in my previous article about why I love parallel programming. I even went so far as to proclaim:

… we are in the middle of a revolution right now! It is a parallel revolution, and this time it is for real.

The last part of that sentence was added because people have been saying parallel programming is the next big thing since the very beginning of the computing age, and yet until recently parallel systems remained expensive and rarely used outside of computing centers. During my time studying for my diploma, a friend of mine bought a dual-processor Celeron system and proclaimed that soon every system on the market would be parallel. Did not happen. When the Pentium Pro chips first appeared, many people were so excited about their multiprocessing capabilities that they voiced the same opinion. Not that time, either. One probably does not need to look far into the past to find similar examples of people getting carried away and proclaiming that parallel programming is all the rage and mainstream. Yet, until today, it has not happened.

Therefore I was sceptical about this claim as well, and have talked to many older and more experienced people from different fields at conferences about this issue. Yet they all seem to agree: this time it's different, the parallel architectures are here to stay, and parallel programming is entering the mainstream. That convinced me.

But of course, who am I to judge what is in the mainstream? I am sitting in my ivory tower (university, although I wish it was a tower, or ivory for that matter :P) all day, doing parallel programming. The group I am working in is already at level 5 (Adoption) of Zeichick's scale. Yet what counts is not me or my group. It's the vendors and ISVs out there who are building and selling actual applications, not merely the prototypes and papers we produce in academia (which is, by the way, one of the main reasons I want to leave the university after my PhD at the end of this year). Are they adopting parallel programming yet? Do they realize the full potential, and also the dangers, of the new computer architectures sold in every supermarket today?

I don't know the answer to that question. I suspect not. I know we are not receiving many requests for students trained in parallel programming, but that may be because I am working at a relatively small and unknown university. When I study the job offerings here in Germany, I rarely see one asking for experience with threading or concurrent programming. That's not a big problem for me personally, as I can always go back to sequential programming or to being a project manager, but a little voice in the back of my head still says the situation should be different: companies should be adapting now, not when it's too late (see below for a description of too late).

Maybe the software shops don't have any performance problems with their software. That is probably the best reason there is not to bother with parallel programming. There are reasons to use threads other than performance, of course, but I think performance remains the most prominent one. This state won't last forever, though. More and more shops will run into performance issues with their software, because current architectures are not getting much faster, but rather more parallel. The game companies are (as always) the first ones to feel this, and as I understand it they are working on the problem (I know Valve is, but I guess Valve is not the typical software shop). Others will surely follow. If they only start thinking about parallel programming once they realize their programs are too slow, it may already be too late.

Refactoring a big, existing codebase to use threads or some other parallel programming model is a huge undertaking (sometimes comparable to a total rewrite). Testing it requires new tests, new tools, or sometimes even new testing frameworks (do you know whether your testing framework is thread-safe? I bet it's not!). It is not impossible, but the whole process takes a lot of time, especially when your developers need to be trained up front. If an ISV decides to do it when the first performance problems appear, they may have a hard time getting a stable release out of the door in time. Or before their money runs out :?. If, on the other hand, the technical leadership of a software shop acknowledges the risk and decides to act in anticipation of the problem, the migration can be started early and will go more smoothly because there is more time to get everything right.

Developers can be trained ahead of time. Some parallel programming systems (e.g. OpenMP) provide an easy way to switch off parallel execution and ship a sequential version with a single compiler switch. If you start parallelizing early enough and there are problems during the migration (and let's be realistic: when was there ever a significant software development project without problems :|), you can still switch off parallelism and ship one more sequential version of your software to keep your customers happy. If that version is already too slow because you started too late, you are out of luck.

But then again, I got side-tracked, and this is just me musing in my ivory tower. That's why I would like to hear from you: do you see concurrent programming being adopted at your workplace? Which stage of the Threading Maturity Model is your company at? Or do you feel that all this talk about parallel programming becoming mainstream is just another hype that will die down as soon as the CPU manufacturers get their act together and start turning the clock-speed screw again? Have you seen demand for parallel programming skills increase lately?

Questions and yet more questions. All of these are highly subjective, therefore I would really like to hear your opinion on all this!

5 Responses to Parallel Programming is Entering the Mainstream – or isn’t it?


Comments

  1. Comment by Michal Migurski | 2007/01/21 at 21:04:41

    My own experience has been an increased interest in parallel programming at the multi-CPU rather than multi-thread model. I’ve started to experiment in parallel, distributed methods after encountering problems that are too big to fit on one computer. I think this is a pattern among friends as well – no one seems to be feeling the horsepower pinch on individual processors, but there is definitely growing interest in big problem sets such as data mining, graphs, and so on. I continue to view threads as needlessly complex, and choose asynchronous, single-thread models such as Twisted over multi-thread ones. The interaction between machines is fascinating, though, and I find myself dealing with the kinds of problems raised by threads as I coordinate processes over network connections. Amazon’s EC2 service has stepped in to help fill this need and support experimentation cheaply.

  2. Comment by Joshua Volz | 2007/01/21 at 21:05:31

    I believe there is little or no customer demand for parallelism, particularly for the business software users. Their software is bottlenecked at the server (which is already running multiple processors and is already parallel) and a faster or more parallel desktop isn’t going to help them at all. This is true of both LAN desktop and web based applications. How much parallel processing do you need to run IE or Firefox when the limiting performance step is your internet connection, the web server (already parallel) or the database (already parallel)? I believe most business users fall into this category.

    I do agree that game companies are going to take advantage of parallelism to great effect. From that perspective, a normal end user might be interested in parallelism. There is a lot of performance gain to be had because graphics programming is largely a mathematical exercise. We haven’t heard about it in the gaming world (customers demanding parallelism) mostly because the customers would have to know what parallelism is before they could ask for it. Computer users are undereducated on this topic.

    I don’t believe parallelism is stillborn. I am just not convinced that it is going to change the entire nature of programming over the next 5–10 years. As you mention, it is an infrastructure change, requiring the retraining of programmers who are already in short supply. Just because we can do something (have the hardware) doesn’t mean we are going to.

  3. Comment by Haibing Shao | 2007/01/30 at 00:18:57

    I am in Germany now and also in the Ivory Tower. 🙂 My field is groundwater-system modeling software, dealing with groundwater contamination, geothermal and geochemical processes. For us, parallel computing is badly needed! Well, but we are too scientific…

    Personal opinion: most current software might not need parallelism.
    This is the list of my most frequently used software:

    *Skype –> yes, not fast enough; parallelism might help (well, dual-core/one thread solves the problem)
    *MSN Messenger –> fast enough, would not help
    *eMule –> the bottleneck is the internet connection
    *BitTorrent –> same as eMule; the next bottleneck is the hard disk
    *media player –> fast enough, would not help
    *VS .NET –> the bottleneck is the hard disk
    *Mozilla Firefox –> fast enough
    *Office suite –> fast enough
    *antivirus –> not fast enough, but dual-core/one thread solves it

    OK, so why would people with a normal life need parallelism? Maybe:
    *games!!! of course visualization and AI will be much faster and better!
    (very possible)
    *voice-driven software (if VS .NET had a function that could recognize my voice and type in ” , ; -> . if () else for while true false int double …….. for me, I would be very, very happy :-) I heard that someone is developing that)

    My point is this: parallelism gives an opportunity for innovative ideas. So far there is not much software on the market that really takes advantage of it. But the biggest prize always goes to the astonishing ones.

  4. Comment by Vladimir Frolov | 2007/04/20 at 23:00:13

    In my opinion, parallel programming has to be used massively on the desktop to become mainstream practice. On one hand, multicore and hyper-threaded desktop processors force desktop software to be parallel; on the other hand, it is currently expensive to create parallel software even for the server side, not to speak of desktops. Parallel programming systems (OpenMP, MPI, POSIX Threads, .NET and Java threads, etc.) are too complex to use on the desktop for most applications. Developers have to be highly qualified to use such systems. Thus I think parallel programming becomes mainstream only when an easy enough parallel programming system is invented. Moreover, this system has to do for parallelism what garbage collection does for memory management: it has to allow developers to create parallel programs without explicit manipulation of threads and without any kind of synchronization. And I hope such systems will be invented extremely soon. I know at least one attempt to devise such a system: Stressflow. Stressflow is not good enough to use as-is, but it is one tiny step in the right direction.

  5. Comment by David Blake | 2007/09/04 at 19:42:12

    Consumers and business customers probably don’t understand why it is difficult, and they wouldn’t understand it if you tried to explain it to them. How could they possibly know how to ask for it, then? Customers do want faster software, but that’s the limit of their ability to express what they want: faster.

    We build and sell mainstream software that uses advanced concurrency, but our algorithms to date have dependencies that prevent us from doing parallel algorithms. We do have unreleased code that uses parallel algorithms. We still have managed to design the software from the ground up with the multi-core consumer desktop in mind.

    The big problem is that today’s university curriculum is moving away from teaching about pointers and other “complex” issues in favor of languages with runtimes that do garbage collection and other services. This school of thought creates graduates who don’t know how to debug real software, and who believe issues that you face in parallel programming should and will be handled by the runtime of the language flavor that is taught.

    If these C.S. departments had any vision, they would leave teaching .NET and Java to the Information Systems department in the business schools and keep the hardcore engineering topics in the engineering classes. Instead, we have a dilution of the quality that the engineering degree used to represent.

