Call for Papers

                            RV'02
                      Second Workshop on
                     Runtime Verification
                http://ase.arc.nasa.gov/rv2002

                  *************************
                  *** Extended deadline ***
                  ***      May 25       ***
                  *************************

                        26 July, 2002
                     Copenhagen, Denmark
 
                   Affiliated with CAV'02
                 http://floc02.diku.dk/CAV



OBJECTIVES

The objective of RV'02 is to bring together scientists from academia 
and industry to discuss how to monitor, analyze, and guide the 
execution of programs. The longer-term goal is to investigate whether 
lightweight formal methods applied during program execution are a 
viable complement to today's heavyweight methods, such as model 
checking and theorem proving, which prove programs correct before 
they are ever run. 
Dynamic program monitoring and analysis can occur during testing or 
during operation. The subject covers several technical fields as 
outlined below. 

Dynamic Program Analysis. Techniques that gather information during 
program execution and use it to infer properties of the program, 
either during testing or in operation. This includes algorithms for 
detecting multi-threading errors in execution traces, such as deadlocks 
and data races.
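To make the trace-based flavor of such algorithms concrete, here is a minimal sketch in the spirit of an Eraser-style lockset algorithm for data-race detection. The trace format, function name, and event encoding are illustrative assumptions, not taken from any particular tool:

```python
# Minimal lockset-style race detection over an execution trace.
# Each event is (thread, op, var, lock), where op is one of
# "acquire", "release", or "access". Illustrative sketch only.

def find_races(trace):
    held = {}      # thread -> set of locks currently held
    lockset = {}   # var -> candidate set of protecting locks
    races = set()
    for thread, op, var, lock in trace:
        if op == "acquire":
            held.setdefault(thread, set()).add(lock)
        elif op == "release":
            held.setdefault(thread, set()).discard(lock)
        elif op == "access":
            locks = held.get(thread, set())
            # Intersect with previous candidates; an empty set
            # means no single lock consistently protects var.
            lockset[var] = lockset.get(var, locks) & locks
            if not lockset[var]:
                races.add(var)
    return races

trace = [
    ("t1", "acquire", None, "L"),
    ("t1", "access",  "x",  None),
    ("t1", "release", None, "L"),
    ("t2", "access",  "x",  None),   # access without holding L
]
print(find_races(trace))  # {'x'}
```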

Specification Languages and Logics. Formal methods researchers have 
investigated logics and developed technologies suited to model 
checking and theorem proving, but monitoring may call for new, 
observation-based foundational logics.
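As one small illustration of an observation-based check, the sketch below monitors a past-time property, "every 'ack' is preceded by a 'req'", over a finite event stream; the property and event names are invented for this example:

```python
# Monitor for the past-time property "ack is always preceded by req".
# Processes events one at a time, as a runtime monitor would.

def monitor(stream):
    seen_req = False
    for event in stream:
        if event == "req":
            seen_req = True
        elif event == "ack" and not seen_req:
            return "violated"   # ack observed with no prior req
    return "ok"

print(monitor(["req", "ack"]))  # ok
print(monitor(["ack"]))         # violated
```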

Program Instrumentation. Techniques for instrumenting programs, at 
the source code or object code/byte code level, to emit relevant 
events to an observer.
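At the source-code level, instrumentation can be as simple as wrapping functions so that they report events to an observer. The following sketch uses a Python decorator; the observer here is just a list, an assumption made for illustration:

```python
# Source-level instrumentation: a decorator that emits call/return
# events to an observer. Here the observer is simply a list.

import functools

events = []  # observer: collects emitted events

def monitored(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        events.append(("call", fn.__name__, args))
        result = fn(*args, **kwargs)
        events.append(("return", fn.__name__, result))
        return result
    return wrapper

@monitored
def add(a, b):
    return a + b

add(2, 3)
print(events)  # [('call', 'add', (2, 3)), ('return', 'add', 5)]
```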

Program Guidance. Techniques for guiding the behavior of a program 
once its specification is violated. This ranges from standard exceptions 
to advanced planning. Guidance can also be used during testing to expose 
errors. 
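At the simple end of this range, guidance can mean catching a specification violation and steering execution to a safe fallback. The sketch below is an illustrative assumption of what that might look like; the specification, names, and fallback policy are all invented:

```python
# Guiding execution on specification violation: catch the violation
# and fall back to a safe default instead of failing.

class SpecViolation(Exception):
    pass

def divide(a, b):
    if b == 0:                      # monitored specification: b != 0
        raise SpecViolation("divisor must be nonzero")
    return a / b

def guided_divide(a, b, fallback=0.0):
    try:
        return divide(a, b)
    except SpecViolation:
        return fallback             # steer execution to a safe state

print(guided_divide(6, 3))   # 2.0
print(guided_divide(6, 0))   # 0.0
```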

Both foundational and practical aspects of dynamic monitoring are 
encouraged. 


INVITED SPEAKER

Professor Insup Lee
Department of Computer and Information Science 
University of Pennsylvania, USA
http://www.cis.upenn.edu/~lee

Insup Lee has been involved in the MAC 
runtime verification project: 

  http://www.cis.upenn.edu/~rtg/mac 

and is currently working on test-case generation.


SUBMISSIONS

The full submission should be sent by ** May 25 **.

Submissions should be up to 20 pages, describing recent work,
work-in-progress, and even highly speculative work on all
aspects of dynamic program monitoring and analysis.

Topics of interest include, but are not limited to, the following:

- Specification languages and logics for program monitoring. This
  includes real-time logics and automata.
- Predictive analysis: predicting possible errors in other traces
  from one observed execution trace.
- Event extraction: how to instrument source code or object code
  to emit events during execution to an observer.
- Tracing and dynamic analysis of concurrent/distributed systems,
  including multi-threading analysis, such as deadlock and data race detection.
- Program behavior correction on-the-fly, based on violation of 
  a specification during program execution.
- Program execution guidance to expose errors.
- Synergy with other program analysis techniques such as testing, 
  model checking and static analysis.

Abstracts and submissions should be sent to one of the organizers.

Accepted papers will be published in Electronic Notes in Theoretical
Computer Science: http://www.elsevier.nl/locate/entcs. Selected papers will
be published in the journal Formal Methods in System Design, Kluwer Academic
Publishers: http://www.kluweronline.com/issn/0925-9856. 


DATES:

  Submissions:   May  25, 2002
  Notification:  June 20, 2002
  Final papers:  June 30, 2002
  Workshop:      July 26, 2002


WEBSITE:

  http://ase.arc.nasa.gov/rv2002


PROGRAM COMMITTEE:

  Saddek Bensalem   (VERIMAG Laboratory)
  Rance Cleaveland  (State University of New York at Stony Brook)
  Michael Ernst     (Massachusetts Institute of Technology)
  Patrice Godefroid (Bell Laboratories)
  Klaus Havelund    (NASA Ames Research Center - Kestrel Technology)
  Gerard Holzmann   (Bell Laboratories)
  Sampath Kannan    (University of Pennsylvania)
  Jim Larus         (Microsoft Research)
  Insup Lee         (University of Pennsylvania)
  Grigore Rosu      (NASA Ames Research Center - RIACS)
  John Rushby       (SRI International)
  Joseph Sifakis    (VERIMAG Laboratory)
  Reid Simmons      (Carnegie Mellon University)
  Henny Sipma       (Stanford University)
  Oleg Sokolsky     (University of Pennsylvania)

ORGANIZING COMMITTEE:

  Klaus Havelund   (NASA Ames Research Center - Kestrel Technology)
  Grigore Rosu     (NASA Ames Research Center - RIACS)