Thursday, October 25, 2007

Does Presenting Your Results Simply Damage Your Early Career Development?

Over the last couple of days, I have been peeking at Nicole Immorlica's enjoyable live report from FOCS 2007. This comment on a post of hers, amongst others, really got me thinking.

There is another factor going on here, which is that younger people often feel the need to "impress" the audience (both those in and outside their area) with how difficult their result is. For better or for worse, this is to some extent essential for grad students hoping to get faculty jobs, and assistant profs looking ahead to their tenure letters.

Once someone gets tenure, they can decide (depending on their personality) whether to keep up this ruse, or whether to aim to present results as simply as possible.

I agree that there is a general tendency amongst researchers at an early stage of their career to show off the hardness of their results, both in their talks and in their papers. (I, for one, would write all of my early papers differently now.) However, I am not at all convinced that this is essential to foster one's early career, as the commenter claims.

When it comes to talks, as in other job-related things, I tend to be a follower of Paul Halmos. (See this compilation of his thoughts that recently appeared in the Notices of the AMS.)

…the purpose of a public lecture is to inform, but to do so in a manner that makes it possible for the audience to absorb the information. An attractive presentation with no content is worthless, to be sure, but a lump of indigestible information is worth no more…

Less is more, said the great architect Mies van der Rohe, and if all lecturers remember that adage, all audiences would be both wiser and happier.

Have you ever disliked a lecture because it was too elementary? I am sure that there are people who would answer yes to that question, but not many. Every time I have asked the question, the person who answered said no, and then looked a little surprised at hearing the answer. A public lecture should be simple and elementary; it should not be complicated and technical.

I like to think that this is the message we should pass on to young researchers from the very beginning of their research careers. Building a reputation as a clear speaker, who can make difficult ideas and results accessible to her/his audience, can only help in one's career.

Isn't it going to be a huge bonus to be remembered as a person who gave a highly accessible and entertaining talk at FOCS/ICALP/LICS/SODA/STOC 2nnn? Why don't we tell our young colleagues that a talk at a generalist conference is essentially a public lecture? The result would be better talks at conferences, a happier audience, and an increased reputation for the speakers.

Wednesday, October 24, 2007

Call For Nominations: EATCS Award

The call for nominations for the EATCS Award 2008 has been published (pdf). The EATCS Award committee consists of Catuscia Palamidessi, David Peleg (chair), and Emo Welzl. Nominations should be sent to David Peleg by December 1, 2007. Further details on the award, including the stellar list of previous awardees, are available here.

Be sure to nominate your favourite TCS researcher! The award will be presented at ICALP'08 in Reykjavík.

Monday, October 22, 2007

Call For Nominations For The Gödel Prize 2008

The 2008 Gödel Prize for outstanding journal articles in the area of theoretical computer science, co-sponsored by EATCS and ACM SIGACT, will be awarded at ICALP 2008, which I am co-organizing in Reykjavík with Magnús M. Halldórsson and Anna Ingolfsdottir. The call for nominations, with deadline 31 January 2008, is now available here.

Make sure that you nominate your favourite paper in TCS for the award! One of the authors of the award-winning paper will attend the conference and will deliver a talk.

Sunday, October 14, 2007

Deadline Approaching: Call for Workshops for ICALP 2008

The deadline for workshop proposals for ICALP 2008 is 31 October 2007. See the call for workshops for details on how to submit a proposal.

Friday, October 12, 2007

NordSec 2007

Over the last couple of days, ICE-TCS and the School of Computer Science at Reykjavík University hosted NordSec 2007, the 12th Nordic Workshop on Secure IT Systems. The workshop featured invited talks by Cedric Fournet (Microsoft Research, Cambridge, UK) and Greg Morrisett (Harvard University, USA). I attended a few talks at the workshop (in between visits to my office to take care of typical daily chores and make sure that my inbox did not overflow), and chaired a session with short presentations.

I found the event interesting and enjoyable. In particular, the two invited talks were excellent. Cedric Fournet presented some joint work with Andrew D. Gordon and Sergio Maffeis reported in the paper A Type Discipline for Authorization in Distributed Systems. In his talk, Cedric addressed the following two key questions.
  1. How can we express good authorization policies?
  2. How can we enforce good authorization policies?
Cedric's tenet in the talk was that logics are good languages for expressing policies, and that type systems can be used to enforce good policies at compile time. He also showed the audience what can be guaranteed at compile time when parts of the system are compromised, and the role that "robust safety" (safety despite compromised principals) plays in reasoning about processes.

Greg Morrisett delivered an inspiring talk on the Cyclone project. Cyclone is a safe dialect of C. It is designed so that pure Cyclone programs are not vulnerable to a wide class of bugs that plague C programs, such as buffer overflows. (Greg said early on in his talk that the legacy of C is, if you allow me to phrase one of his slides as a regular expression, (Buffer overrun)* and that he hates C!) Greg's talk was a great ad for some of the topics I am about to cover in my course on semantics for programming languages, and highlighted how good time-honoured theory in new clothes can help improve the safety of a beast like C.

I think that his talk gave each member of the audience something to take home, and that's one of the main secrets of a successful invited address. I just wish that more of my colleagues had been there to listen.

CONCUR 2009

Breaking news! I am pleased to announce that CONCUR 2009 will be held in Bologna, and will be organized by Mario Bravetti and Gianluigi Zavattaro. This will be the second Italian CONCUR, after Pisa 1996.

Congratulations to Bologna, which hosts one of the strongest CS departments in Italy and is a hotbed of concurrency-theoretic research, and to CONCUR for visiting a lovely city.

Monday, October 08, 2007

Prize Time

It is the time of the year when prize announcements are out fast and furious.

The ink has not yet dried on the news announcing that the 2007 Nobel prize for medicine has been awarded to Mario R. Capecchi, Sir Martin J. Evans and Oliver Smithies "for their discoveries of principles for introducing specific gene modifications in mice by the use of embryonic stem cells". (I notice that Mario Capecchi is yet another Italian expatriate who, like Fermi and Segre before him, has become a US citizen.)

In mathematics/CS, László Lovász has received the Bolyai Prize, and Ben Green, Hershel Smith Professor of Mathematics at Cambridge University (UK), will receive the 2007 SASTRA Ramanujan Prize "for his phenomenal contributions to number theory by himself and in collaboration with Terence Tao, that have changed the face of combinatorial additive number theory."

Tomorrow is the turn of the Nobel prize for physics. It'll be interesting to see who receives it. I hope for an award to an Italian in Italy :-)

Sunday, October 07, 2007

Frits Vaandrager's Answers

I have received a contribution from Frits Vaandrager to the discussion arising from the CONCUR workshop. I am very pleased to post it here.
Enjoy!


# What is the role/importance of real-time in modelling? Does industry want dense-time or discrete-time models?

When modelling embedded systems and/or protocols, real-time aspects have to be dealt with very frequently. If one prefers not to use a real-time model checker, one can often encode real-time constraints in a finite-state model checker. For instance, in an STTT paper from 2002, Brinksma, Mader and Fehnker analyzed a PLC control schedule using SPIN, and obtained results that were (at that time) competitive with Uppaal. Their trick was to advance the discrete clock variable not one by one, but directly to the point of the next event.
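To make the trick concrete, here is a minimal Python sketch of the idea, with illustrative event names and deadlines (it is not the Brinksma-Mader-Fehnker model, just the clock-jumping scheme in a few lines): rather than enumerating every clock tick, the exploration jumps straight to the earliest pending deadline.

```python
# Minimal sketch of the clock-jumping trick (illustrative names only):
# advance the discrete clock directly to the next pending deadline
# instead of incrementing it one time unit at a time.

def explore(deadlines, horizon):
    """Walk through events in deadline order, jumping the clock.

    `deadlines` maps an event name to its absolute deadline. Each step
    advances the clock straight to the earliest deadline, so the number
    of explored states is proportional to the number of events, not to
    the length of `horizon`.
    """
    clock = 0
    pending = dict(deadlines)
    trace = []
    while pending and clock <= horizon:
        event, deadline = min(pending.items(), key=lambda kv: kv[1])
        if deadline > horizon:
            break
        clock = deadline          # jump, don't tick
        trace.append((clock, event))
        del pending[event]
    return trace

if __name__ == "__main__":
    # Three illustrative events with deadlines 3, 7 and 12:
    print(explore({"fill": 3, "heat": 7, "drain": 12}, horizon=20))
```

In a SPIN model the same jump is performed on the discrete clock variable, and the saving is the same: one transition per event instead of one per time unit.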

In his PhD thesis, Rajeev Alur made a good case for dense-time models. In the case of open systems with components that have different clocks, dense time is conceptually the right choice. But in practice one often gets away with discrete-time models.

Industry has no preference for dense-time or discrete-time: any approach that solves their problems is fine.


# How does one involve small- and medium-size companies in collaborations with concurrency theoreticians/practitioners? Does "company size" matter?

The maturity of model checkers has increased enormously over the last few years, and as a result it has become much easier to apply them, even for non-experts. Let me give two examples:

- Allen B. Downey has written "The Little Book of Semaphores", a nice book in which he presents solutions to dozens of concurrency problems, ranging from classical ones (like the barbershop) to tricky and obscure problems (like the sushi bar); a runnable sketch of one such classic appears after these two examples. This year I asked a class of first-year computer science students with just a few hours of model-checking experience to pick a problem from Downey's book (a different problem for each pair of students) and to model and analyze Downey's solution for it using Uppaal. As a result, my students spotted several mistakes in Downey's book, mistakes which the author has confirmed.

- Matthijs Mekking, a student interested in implementing protocols and with no particular expertise in formal methods, just completed an MSc thesis project at NLnet Labs on the SHIM6 protocol, a proposed IETF standard for multi-homing. At some point he started to model the protocol using Uppaal and became enthusiastic. He managed to find several nontrivial mistakes in the standard and greatly impressed the protocol designers. His results directly influenced the new version of the standard. (See http://www.ita.cs.ru.nl/publications/papers/fvaan/SHIM6/)
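As promised above, here is a flavour of the problems in Downey's book: the classical two-thread rendezvous, as a small runnable Python sketch. The structure follows Downey's solution, though the names are mine; each thread signals its own arrival and then waits for the other, so neither can pass the barrier alone.

```python
# The two-thread rendezvous from "The Little Book of Semaphores":
# statement a2 must run after b1, and b2 must run after a1.

import threading

a_arrived = threading.Semaphore(0)
b_arrived = threading.Semaphore(0)

def thread_a():
    print("a: statement a1")
    a_arrived.release()        # signal that a has arrived
    b_arrived.acquire()        # wait for b
    print("a: statement a2")   # guaranteed to run after b1

def thread_b():
    print("b: statement b1")
    b_arrived.release()        # signal that b has arrived
    a_arrived.acquire()        # wait for a
    print("b: statement b2")   # guaranteed to run after a1

if __name__ == "__main__":
    ta = threading.Thread(target=thread_a)
    tb = threading.Thread(target=thread_b)
    ta.start(); tb.start()
    ta.join(); tb.join()
```

If both threads waited before signalling, the program would deadlock; that is exactly the kind of error that the students' Uppaal models can catch exhaustively, rather than by lucky scheduling.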

SMEs usually don't have verification specialists in house, so they need some advice on modelling and analysis. But with today's technology it is possible to get results rather quickly, and they can do part of the work themselves. The situation will further improve when MSc and PhD graduates with a background in model checking get jobs at these companies. My group is collaborating with several SMEs, and the verification problems they provide us with are often interesting from an academic perspective.

Unlike SMEs, large companies are able to invest in long-term research.

# Is there any need for stochastic and probabilistic modelling in applications? More pointedly, have you met an example that you could not model because your tool does not support stochastic or probabilistic phenomena?

Yes!! There is a great need for stochastic and probabilistic modelling and analysis techniques, and I would for instance welcome a tool that combines the functionality of Uppaal and PRISM.

The Zeroconf protocol, for instance, is full of probabilistic features that we could not model using Uppaal. If we really want to make an impact in the area of wireless sensor networks, mobile ad-hoc networks, and P2P networks, we need to extend model checkers with probabilities, since the design of these networks can only be properly understood if we take probabilities into account.
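As a back-of-the-envelope illustration of why such questions are inherently probabilistic, consider just the address-selection step of Zeroconf: a host picks a random IPv4 link-local address from the 65,024 usable ones of RFC 3927, so both the collision probability and the expected number of picks until a free address is found are probabilistic quantities. A tiny, purely illustrative Python sketch:

```python
# Back-of-the-envelope arithmetic for Zeroconf address selection.
# Assumes the 65,024 usable addresses of RFC 3927; illustrative only,
# not a verified model of the protocol.

N = 65024  # usable addresses in 169.254/16 (RFC 3927)

def collision_probability(hosts):
    """P(the next host's random pick collides with a taken address)."""
    return hosts / N

def expected_picks(hosts):
    """Expected number of picks until a free address (geometric law)."""
    return 1 / (1 - collision_probability(hosts))

if __name__ == "__main__":
    for hosts in (10, 1000, 30000):
        print(f"{hosts:>5} hosts: "
              f"collision p = {collision_probability(hosts):.4f}, "
              f"expected picks = {expected_picks(hosts):.3f}")
```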

# How can we, as a community, foster the building of industrial-strength tools based on sound theories?

Theory for the sake of theory is good and important, but IMHO the concurrency community has too much of it. In order to really push model-checking technology into industry, the performance and functionality of these tools must be improved further by several orders of magnitude. This can only be achieved by a combined and focused effort of a large team of researchers, and it is also a major academic challenge.

In order to test new ideas one needs prototype tools to play with. However, I believe it is not healthy that almost every research group on model checking has its own tool. Only a few groups manage to keep their model-checking tool alive for more than a decade. Developing an industrial-strength model-checking tool requires a huge effort. I think academic groups have to join forces if they want to build (and maintain!) industrial-strength tools. Uppaal has been highly successful in part due to the continued efforts of teams in Aalborg, Uppsala, Nijmegen, Twente and elsewhere. So why don't the CADP and muCRL teams join forces? Why isn't it possible to establish stronger ties between Uppaal and IF? Why are there so many probabilistic model checkers?

By combining efforts we can be much more effective.

# What has concurrency theory offered industry so far? What are the next steps that the concurrency community should take in order to increase the impact of its research in an industrial setting? And what are future promising application areas for concurrency research?

Its main contributions so far are very powerful modelling languages and concepts, and of course model checking.

A major challenge ahead of us is to combine features. In model checking, people have proposed symmetry reduction, partial-order reduction, CEGAR, etc. What about the combination of all these features? How can we prove that such a combination is sound? Can we combine probabilistic choice, hybrid aspects, real time, and hierarchy as in statecharts? I believe it will be possible to define such combinations, but I expect we will need theorem provers and proof assistants to help us manage the resulting complexity.

I am aware of applications in other areas, but I believe computer system engineering will remain the most important application area for concurrency theory in the coming decade.

Friday, October 05, 2007

Joost-Pieter Katoen On The Need for Probabilistic and Stochastic Modelling

I am posting this message on behalf of Joost-Pieter Katoen, who sent me his reaction to one of the questions posed to the panel members during our workshop at CONCUR 2007. Enjoy!

I'd like to answer the question on the need for stochastic and probabilistic modeling (and analysis). Some concrete examples of case studies provided by industry for which probabilistic aspects are very important are listed below. The importance of explicitly modeling random effects stands or falls with the kind of property to be established, of course, so I am definitely not claiming that these examples cannot (and should not) be modeled by techniques that do not support random phenomena.

1. Leader election in IEEE 1394: in case of a conflict (two nodes both claim to be the leader), the contending nodes send a "be my parent" message and randomly wait either a short or a long time. What is the optimal policy for resolving the contention as fast as possible? (This turns out to be a slightly biased coin; a toy simulation sketch follows this list.)

2. In the Ametist EU project, the German industrial partner Axxom generated schedules for a lacquer production plant. While doing so, they abstracted from many details that the lacquer producer supplies, such as the average fraction of time a resource is not operational, the fraction of (operational) time the resource can be used because the necessary human support is present, and so forth. In their abstraction they scheduled tasks conservatively, and they were interested in whether they could improve their schedules while reducing the probability of missing the deadline. Clearly, stochastic modeling is needed here, and it has indeed been carried out using a stochastic process algebra.

3. Hubert and Holger should be able to say much more about a recent project they are pursuing with a French company on the validation of multiprocessor multi-threaded architectures. I do not know exactly what they are investigating, but they use stochastic process algebras to model!
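Coming back to the first example, here is a toy Monte Carlo sketch of root contention (the waiting times and the resolution cost are illustrative, not the IEEE 1394 constants). Note that if one only counts rounds, a fair coin is optimal, since the per-round success probability is 2p(1-p); the slight bias only pays off once the unequal costs of short and long waits enter the model, which is precisely the kind of question that needs both time and probability in one formalism.

```python
# Toy Monte Carlo sketch of root contention (illustrative parameters):
# two nodes each flip a coin with P(wait short) = p. Equal choices
# force another round; different choices resolve the contention.

import random

SHORT, LONG = 1.0, 1.6   # illustrative waiting times

def time_to_resolve(p):
    """Simulated elapsed time until the two nodes choose differently."""
    t = 0.0
    while True:
        a = SHORT if random.random() < p else LONG
        b = SHORT if random.random() < p else LONG
        if a != b:
            return t + SHORT  # simplification: the short waiter wins
        t += a                # equal choices: wasted round, try again

def mean_time(p, trials=200_000):
    return sum(time_to_resolve(p) for _ in range(trials)) / trials

if __name__ == "__main__":
    # A coin slightly biased towards "short" beats the fair coin here:
    for p in (0.45, 0.50, 0.55, 0.60, 0.65):
        print(f"P(short) = {p:.2f}: expected time ~ {mean_time(p):.3f}")
```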

Finally, let me say (as Moshe is also indicating) that the interest in probabilistic modeling is growing steadily. To give an example, the European Space Agency (ESA) is currently considering the use of probabilistic modeling and analysis in the context of AADL, an architecture specification language in which the failure rates of components are an important ingredient.

All in all, it is fair to say that there is a genuine demand for probabilistic techniques!

Joost-Pieter Katoen

CNRS Bronze Medal To Concurrency Theorist

I learned last night that Patricia Bouyer (CNRS researcher at LSV, ENS Cachan) has received the 2007 CNRS bronze medal in section 7. The CNRS bronze medal is awarded for outstanding achievement by a junior researcher.

Congratulations to Patricia, with whom I was lucky enough to work on this paper. At that time, Patricia was a freshly minted MSc student, but her talent was already abundantly clear. I recall that we were trying to show that a property language was sufficiently expressive to describe all of the properties of timed automata that could be checked by reachability analysis in the context of a test process. Patricia constructed a test process which had no counterpart in the property language, and this led to a strengthening of the property language, which we eventually were able to prove expressively complete.

From those early days, Patricia has gone from strength to strength, as witnessed by her impressive output. I believe that there is much more to come from her.

Tuesday, October 02, 2007

Reverse Age Discrimination

Every now and then, I rant on this blog about Italian academia. I do this because of the strong ties I have with my country (or, in typical Italian fashion, I should say "well, actually, with my region, Abruzzo" or even "with my home town, Pescara" :-)), and because I am saddened by the frustration I sense in the Italian colleagues I meet at conferences or exchange emails with.

The latest installment in the saga of Italian academia that I have read is the commentary Reverse Age Discrimination, written for Nature Physics by Francesco Sylos Labini and Stefano Zapperi, two physicists based in Rome. Their piece paints a rather bleak picture for young Italian scientists and reports on what they call "Lo Tsunami dell'Università Italiana" (The Tsunami of the Italian University). I encourage you to read their opinions and to look at the statistics they report on here.

The figures are amazing. Italian universities have an unusually large fraction of old professors. In Italy, 41% of the university professors are aged 60 or older and only 4% are below 40. If we consider full professors only, we discover that overall in Italy more than 47% are aged 60 or older, but for physicists this percentage reaches a hefty 64%!

Add to this picture that Italian universities have a very irregular recruitment flow, with sudden bursts of hiring based on seniority rather than on merit, and you will see that the only option left for many talented young scientists is to move to other countries. The commentary reports that this year the French National Research Council (CNRS) selected seven young researchers (CR2) in theoretical physics, of whom four were Italians! As for CS, let me go on record once more as saying that the Italians in Paris would form a very strong TCS department by any measure.

Let me close by quoting from the last paragraph in the article.

To conclude, we acknowledge that some older scientists are active and productive, and that European countries should do something to keep them engaged. … The broad question of how to use the experience of older faculty without hindering the advancement of the younger generation remains an important challenge.

I hope that this is a challenge that Italy's academic establishment will face very soon. The future of Italian science depends on it.