Unless otherwise noted, the seminar takes place on Mondays at 13:15 in room IC 4/39-41. In the summer semester of 2006 it was organized by Michael Psarros.

20 March 2006 Wilhelm Dolle Current Methods and Tools for Detecting Windows Rootkits
5 April 2006 Benedikt Gierlichs Signal Theoretical Methods in Differential Side Channel Analysis
10 April 2006 Prof. Dr. Virgil D. Gligor On the Evolution of Adversary Models in Security Protocols - from the Beginning to Sensor Networks
21 April 2006 Prof. Dr. Clark Thomborson Trusted Computing: Open, Closed, or Both?
24 April 2006 Melanie Volkamer Secure Dynamic Web Services Composition
4 May 2006 Dr. Emmanuel Bresson On Security Models for Group Key Exchange Protocols
8 May 2006 Dr. Bodo Möller Provably Secure Password-Based Authentication in TLS
15 May 2006 Tim Güneysu Efficient Hardware Architectures for Solving the Discrete Logarithm Problem on Elliptic Curves
22 May 2006 Marius Mertens SMITH - A Parallel Hardware Architecture for Fast Gaussian Elimination over GF(2)
29 May 2006 Hanno Langweg Security of Client Applications for Electronic Signatures
8 June 2006 Prof. Dr. Roberto Avanzi Delaying and Merging Operations in Scalar Multiplication: Applications to Curve-Based Cryptosystems
12 June 2006 Dr. Bodo Möller TLS with Elliptic Curves
19 June 2006 Dr. Sigrid Gürgens A Definitional Framework for Security Properties
26 June 2006 Dr. Marian Margraf The European ePassport
3 July 2006 Prof. Dr. Roberto Avanzi On Redundant τ-adic Expansions, the Characterization of Non-Adjacent Digit Sets and Their Applications
4 September 2006 Prof. Dr. Clemens Martin Managing IT Security - a Quantifiable Endeavour?
18 September 2006 Prof. Dr. Berk Sunar Non-linear Residue Codes for Robust Public-Key Arithmetic


Wilhelm Dolle, interActive Systems Berlin:
Current Methods and Tools for Detecting Windows Rootkits

Most viruses, worms, and Trojan horses are easy to detect with common antivirus programs. Recently, however, the number of rootkits has been growing; through various manipulations of the system, they evade detection by such conventional scanners.

The talk will cover new techniques that rootkits use to hide files, directories, processes, and registry entries, among other things, and will present methods and tools for uncovering them. One of the newer approaches to detecting such rootkits under Windows is so-called "cross-view based rootkit detection". It compares two views of the system: the view of files, processes, and registry entries through the Windows API, and the view of the same data obtained through direct low-level access to kernel data structures and file systems. This comparison reveals anomalies and yields evidence of rootkits.
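
Conceptually, cross-view detection is just a set difference between two enumerations of the same objects. The following Python sketch illustrates the idea; both collectors are hypothetical stubs, since a real tool would query the documented Windows API for one view and parse kernel structures or the raw file system for the other:

    # Cross-view rootkit detection, conceptual sketch only.
    # The two collectors are hypothetical placeholders.

    def view_via_windows_api() -> set[str]:
        """High-level view, e.g. the process list as reported by the API."""
        raise NotImplementedError("query the Windows API here")

    def view_via_low_level() -> set[str]:
        """Low-level view, e.g. walking kernel data structures directly."""
        raise NotImplementedError("parse kernel structures / raw disk here")

    def cross_view_anomalies() -> set[str]:
        # Objects visible at the low level but missing from the API view
        # are exactly the anomalies that hint at a rootkit.
        return view_via_low_level() - view_via_windows_api()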

Wilhelm Dolle, CISA, CISSP, and a BSI-licensed IT-Grundschutz auditor, has been working with networks and their security since the 1980s. Since 1999 he has worked for interActive Systems GmbH (iAS), where he is the member of the management responsible for Information Technology and IT Security. He regularly gives talks on IT and security topics and has published extensively in this field. He also works as a reviewer for several publishers and teaches network security at a Berufsakademie (university of cooperative education).

Benedikt Gierlichs, COSY, RUB:
Signal Theoretical Methods in Differential Side Channel Analysis

This work presents the application of signal-theoretic methods to the side-channel leakage of cryptographic devices, in particular their power consumption and electromagnetic radiation. We analyse the Template Attack and the Stochastic Model in depth and compare their efficiency in various parameter settings. Finally, we suggest and verify improvements to both attacks which increase their success probabilities by a factor of up to five.
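
For orientation, the classical template approach (sketched here in broad strokes, not necessarily in the exact form analysed in the talk) models the leakage of each key hypothesis as a multivariate Gaussian estimated from profiling traces and classifies an attack trace by maximum likelihood:

    import numpy as np
    from scipy.stats import multivariate_normal

    def build_templates(traces, labels):
        # One (mean, covariance) template per key hypothesis,
        # estimated from labelled profiling traces (rows = traces).
        templates = {}
        for k in np.unique(labels):
            t = traces[labels == k]
            templates[k] = (t.mean(axis=0), np.cov(t, rowvar=False))
        return templates

    def classify(trace, templates):
        # Return the hypothesis with the highest Gaussian log-likelihood.
        return max(templates, key=lambda k: multivariate_normal.logpdf(
            trace, mean=templates[k][0], cov=templates[k][1],
            allow_singular=True))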

Prof. Dr. Virgil D. Gligor, University of Maryland, USA:
On the Evolution of Adversary Models in Security Protocols - from the Beginning to Sensor Networks

Invariably, new technologies introduce new vulnerabilities which, in principle, enable new attacks by increasingly potent adversaries. Yet new systems are more adept at handling well-known attacks by old adversaries than at anticipating new ones. Our adversary models seem to be perpetually out of date: often they do not capture attacks enabled by new vulnerabilities, and sometimes they address attacks rendered impractical by new technologies.

In this talk, I provide a brief overview of adversary models beginning with those required by program and data sharing technologies, continuing with those required by computer communication and networking technologies, and ending with those required by mobile ad-hoc and sensor network technologies. I argue that mobile ad-hoc and sensor networks require new adversary models (e.g., different from those of Dolev-Yao and Byzantine adversaries). I illustrate this with adversaries that attack perfectly sensible and otherwise correct protocols of mobile ad-hoc and sensor networks. These attacks cannot be countered with traditional security protocols as they require emergent security properties.

Biographical Note:
Virgil D. Gligor received his B.Sc., M.Sc., and Ph.D. degrees from the University of California at Berkeley. He has been at the University of Maryland since 1976, and is currently a Professor of Electrical and Computer Engineering. He is an Editorial Board member of the ACM Transactions on Information System Security, IEEE Transactions on Dependable and Secure Computing, and IEEE Transactions on Computers. Over the past three decades, his research interests have ranged from access control mechanisms, penetration analysis, and denial-of-service protection to cryptographic protocols and applied cryptography. Recently, he was awarded the 2006 National Information Security Award by NIST and NSA in the US for his contributions to information security research.

Prof. Dr. Clark Thomborson, University of Auckland, New Zealand:
Trusted Computing: Open, Closed, or Both?

How might a next-generation computer system help us to decide, remember, and change our minds about whom and what we trust, to do what, under which circumstances?

Our question has been partly answered by a bewildering array of competing architectures, including SELinux and Microsoft's Trustworthy Computing Initiative. In this seminar I attempt to shed some light on the architectural competition.

My analysis is based on an examination of three use cases (email, business-to-business e-commerce, and digital rights management), with respect to three types of trust relationships: hierarchical trust, peer trust, and bridging trust. Open-source and closed-source design methodologies have differing strengths and weaknesses in supporting each of these three relationship types, suggesting that the most successful trusted computer systems of the future will be designed by a hybrid methodology.

Biographical note:
Dr Clark Thomborson has served as a Professor of Computer Science at the University of Auckland, New Zealand, since 1996. His prior academic positions were at the University of Minnesota, and at the University of California at Berkeley, with consultancies or temporary positions at MIT, Microsoft Research (Redmond), InterTrust, IBM Yorktown, IBM Almaden, Institute for Technical Cybernetics (Slovakia), and Xerox PARC.

He gained several years of commercial experience in the USA as a systems integrator at Digital Biometrics Inc (now Identix), LaserMaster Inc, and Nicolet Instrument Corp (now Thermo Electron).

Under his birth name Clark Thompson, he was awarded a PhD in Computer Science from Carnegie Mellon University and a BS (Honors) in Chemistry from Stanford. He has published more than 90 refereed papers on topics in software security, computer systems performance analysis, VLSI algorithms, data compression, and connection networks.

Melanie Volkamer, DFKI Saarbrücken:
Secure Dynamic Web Services Composition

Today, web services are a powerful concept for providing distributed electronic services. They allow the dynamic composition of complex services out of a library of individual services. Functional composition of complex services is well understood and supported.

However, applying dynamic web service composition in practice requires appropriate security facilities to guarantee the security requirements of all participants. On the one hand, web services have to be protected against misuse of their resources; on the other hand, the customers of web services require that the privacy of their data be respected. While there are standard approaches for the secure execution of web services based on access control, the SCALLOPS project focuses on privacy.

We describe security policies for the customers and security specifications for the web services, and we formalize how web services have to handle both the data provided by the customer and newly created data. Our methodology is based on type-based information flow control.
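
As a toy illustration of type-based information flow control (not the SCALLOPS formalism itself): every value carries a security label, and a composition step is rejected whenever data would flow from a higher label to a lower one.

    # Minimal two-level flow lattice: LOW may flow to HIGH, never back.
    LOW, HIGH = 0, 1

    def may_flow(src: int, dst: int) -> bool:
        # Information may only flow upwards in the lattice.
        return src <= dst

    def compose(output_label: int, input_label: int) -> None:
        # Reject a composition step that would pass private customer
        # data to a less trusted web service.
        if not may_flow(output_label, input_label):
            raise PermissionError("illegal information flow")

    compose(LOW, HIGH)    # fine: public data into a trusted service
    # compose(HIGH, LOW)  # would raise: private data would leak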

Dr. Emmanuel Bresson, Cryptology department, CELAR Technology Center, France:
On Security Models for Group Key Exchange Protocols

A group key exchange protocol enables a pool of participants, communicating over a public network, to securely establish a common session key in such a way that nobody outside the pool obtains any information about it. While key exchange is a central primitive in cryptography, the formalization of the underlying security notions, as well as of those required in the case of group key exchange, was achieved only quite recently.

In this talk, we describe the formal security model we proposed in 2001 (joint work with O. Chevassut and D. Pointcheval) in order to analyze existing protocols, for which only heuristic security arguments had been given. We will discuss in detail the extensions and refinements that can be made to this model: formalizing authentication between the players, dealing with concurrent executions, corruption and forward secrecy, dynamic group membership, etc. We will also compare the model with other approaches that have been developed by Bellare, Canetti, et al., and we will briefly discuss possible variants of the model, in particular how to extend it to password-based scenarios.

Dr. Bodo Möller, COSY, RUB:
Provably Secure Password-Based Authentication in TLS

How can an efficient and provably secure password-authenticated key exchange scheme be designed specifically for SSL and TLS (Transport Layer Security)? This talk presents a suitable scheme: "SOKE" (Simple Open Key Exchange) is a variant of the Diffie-Hellman key exchange in which the client's public share is encrypted with a simple mask. Compared with earlier proposals for password-based authentication in TLS, this yields advantages in security, efficiency, and protocol flexibility.

(Joint work with Michel Abdalla, Emmanuel Bresson, Olivier Chevassut, and David Pointcheval.)
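
The masking idea can be sketched over a toy group, assuming (as in related masked Diffie-Hellman schemes; the actual SOKE protocol differs in its details and key derivation) that the client multiplies its share g^x by g'^pw for a second, independent generator g':

    import hashlib, secrets

    # Toy group parameters, far too small for real use:
    # the order-q subgroup of Z_p* for the safe prime p = 2q + 1.
    p, q = 1019, 509
    g, g2 = 4, 3              # two independent subgroup generators
    pw = 42                   # shared low-entropy password, as an exponent

    # Client: mask the Diffie-Hellman share with g2^pw.
    x = secrets.randbelow(q)
    X_star = pow(g, x, p) * pow(g2, pw, p) % p   # sent on the wire

    # Server: ordinary Diffie-Hellman share.
    y = secrets.randbelow(q)
    Y = pow(g, y, p)

    # Server unmasks with the password (divide by g2^pw), then both
    # sides compute the same Diffie-Hellman value g^(xy).
    K_server = pow(X_star * pow(g2, q - pw, p) % p, y, p)
    K_client = pow(Y, x, p)
    assert K_client == K_server

    # Session key: hash of the shared value and the transcript.
    sk = hashlib.sha256(f"{X_star},{Y},{K_client}".encode()).digest()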

Tim Güneysu, COSY, RUB:
Efficient Hardware Architectures for Solving the Discrete Logarithm Problem on Elliptic Curves

The use of elliptic curves (EC) in cryptography is very promising because of their resistance to powerful index-calculus attacks. Since their introduction in the mid 1980s, elliptic curve cryptosystems (ECC) have become an alternative to common public-key (PK) cryptosystems such as RSA. With a significantly smaller bit size, ECC provides security comparable to that of other PK systems (e.g. RSA). The security of a cryptosystem is essentially defined by the effort required to break it: a "secure" cryptosystem will most likely not be broken within the next decades, even taking technological progress into account. As a consequence, conventional attacks based on software implementations of cryptanalytic algorithms will most probably never succeed in breaking actual ciphers. It is widely accepted that the only feasible way to attack such cryptosystems is to use dedicated hardware.

In times of improved hardware manufacturing and increasing computational power, the question arises how secure the small key lengths of ECC are in the face of a massively parallel attack based on special-purpose hardware. This is the first work presenting an architecture and an FPGA implementation of an attack on ECC. We present an FPGA-based multi-processing hardware architecture for the Pollard rho method for EC over GF(p), which is, to our current knowledge, the most efficient attack known against ECC. The implementation runs on a conventional low-cost FPGA such as those found, e.g., in the parallel code-breaking machine COPACOBANA, which provides a large cluster of FPGAs and hence a large amount of computational power.
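
For reference, the following toy Python sketch shows the sequential Floyd-cycle variant of Pollard's rho for the ECDLP on a textbook curve; the hardware uses the parallelizable distinguished-point variant, and real parameters are astronomically larger:

    import secrets

    # Toy curve y^2 = x^3 + 2x + 2 over GF(17); P = (5, 1) generates
    # a subgroup of prime order n = 19 (a common textbook example).
    p, a_coef, n = 17, 2, 19
    P, O = (5, 1), None          # O marks the point at infinity

    def ec_add(R, S):
        if R is O: return S
        if S is O: return R
        (x1, y1), (x2, y2) = R, S
        if x1 == x2 and (y1 + y2) % p == 0:
            return O
        if R == S:
            lam = (3 * x1 * x1 + a_coef) * pow(2 * y1, -1, p) % p
        else:
            lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
        x3 = (lam * lam - x1 - x2) % p
        return (x3, (lam * (x1 - x3) - y1) % p)

    def ec_mul(k, R):
        acc = O
        while k:
            if k & 1: acc = ec_add(acc, R)
            R, k = ec_add(R, R), k >> 1
        return acc

    def step(R, a, b, Q):
        # Pseudo-random walk over three partitions of the curve.
        if R is O or R[0] % 3 == 0:
            return ec_add(R, P), (a + 1) % n, b
        if R[0] % 3 == 1:
            return ec_add(R, R), 2 * a % n, 2 * b % n
        return ec_add(R, Q), a, (b + 1) % n

    def rho(Q):
        # Solve Q = d*P by Floyd cycle-finding on the walk above.
        while True:
            a, b = secrets.randbelow(n), secrets.randbelow(n)
            T = ec_add(ec_mul(a, P), ec_mul(b, Q))
            H, ha, hb = T, a, b
            ta, tb = a, b
            while True:
                T, ta, tb = step(T, ta, tb, Q)            # tortoise: 1 step
                H, ha, hb = step(*step(H, ha, hb, Q), Q)  # hare: 2 steps
                if T == H:
                    break
            if (hb - tb) % n:
                return (ta - ha) * pow(hb - tb, -1, n) % n

    assert rho(ec_mul(13, P)) == 13   # recover the secret scalar d = 13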

Furthermore, we project the results onto actual ECC key lengths (e.g. k = 160 bits) and estimate the expected runtimes for a successful attack. Since FPGA-based attacks are out of reach for such key lengths, we also present estimates for an ASIC design. As a result, ECC over GF(p) with bit sizes k > 160 can be considered infeasible to break with current algorithms and with today's computational and financial resources.

Marius Mertens, COSY, RUB:
SMITH - A Parallel Hardware Architecture for Fast Gaussian Elimination over GF(2)

This talk presents a hardware-optimized variant of the well-known Gaussian elimination over GF(2) and its highly efficient implementation. The proposed hardware architecture can solve any regular and (uniquely solvable) over-determined linear system of equations (LSE) and is not limited to matrices of a certain structure. Besides solving LSEs, the architecture can also perform the related task of matrix inversion extremely fast.
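
In software the same arithmetic is conveniently expressed with machine words: a row of the augmented matrix is a bit vector, row addition is XOR, and no pivot scaling is ever needed since the only non-zero element of GF(2) is 1. A small Python sketch (Gauss-Jordan, rows encoded as integers) of the algorithm the architecture parallelizes:

    def solve_gf2(rows, width):
        # Each row is an int: bits 0..width-1 hold the coefficients,
        # bit 'width' holds the right-hand side. Returns the solution
        # as a list of 0/1, or None if there is no unique solution.
        rows, used = list(rows), 0
        for col in range(width):
            for i in range(used, len(rows)):    # find a pivot row
                if rows[i] >> col & 1:
                    rows[used], rows[i] = rows[i], rows[used]
                    break
            else:
                return None                     # singular system
            for i in range(len(rows)):          # clear the column (XOR)
                if i != used and rows[i] >> col & 1:
                    rows[i] ^= rows[used]
            used += 1
        if any(rows[width:]):
            return None                         # inconsistent extra rows
        return [rows[i] >> width & 1 for i in range(width)]

    # x0 + x1 = 1 and x1 = 1 over GF(2)  ->  x0 = 0, x1 = 1
    assert solve_gf2([0b111, 0b110], 2) == [0, 1]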

As a proof of concept, the architecture has been realized on a contemporary low-cost FPGA. The implementation for a 50 × 50 LSE can be clocked with a frequency of up to 300 MHz and computes the solution in 0.33 µs on average.

In addition, the physical requirements for larger implementations are discussed with special focus on the limitations imposed by current CMOS technology and how they could possibly be overcome using optical components.

Hanno Langweg, Gjøvik University College, Norwegen:
Security of Client Applications for Electronic Signatures

Smartcard-based electronic signatures are regarded as a secure way to guarantee the authenticity of declarations of intent. We examine current signature-creation software with respect to its resistance to manipulation. It turns out that most products offer only little protection against attacks, even under a restrictive attacker model. Among other things, it is possible to sign data other than the data displayed, to mislead the user into operating errors, and to falsify the display of signature verification results. The class 2 card terminals with secure PIN entry that are widely promoted as a solution provide, by themselves, only a small gain in security.

Prof. Dr. Roberto Avanzi, CITS, RUB:
Delaying and Merging Operations in Scalar Multiplication: Applications to Curve-Based Cryptosystems

In this presentation we introduce scalar multiplication algorithms for several classes of elliptic and hyperelliptic curves that use different types of operations besides the group addition: doubling, halving, and the Frobenius operation. The methods are variations on Yao's scalar multiplication algorithm that allow an intrinsic parallelism of operations to become apparent. We can thus merge several group operations and reduce the number of field operations by means of Montgomery's trick. As a result, scalar multiplication on elliptic curves in even characteristic based on point halving can be improved by about 10%, and the performance of Koblitz curves can be improved by at least 20%.
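
The merging step relies on Montgomery's simultaneous-inversion trick, which replaces n independent field inversions by a single inversion plus roughly 3n multiplications. A sketch over a prime field:

    def batch_invert(xs, p):
        # Montgomery's trick: invert every x in xs modulo p using one
        # modular inversion and about 3*len(xs) multiplications.
        prefix = [1]
        for x in xs:                      # prefix[i] = x_0 * ... * x_{i-1}
            prefix.append(prefix[-1] * x % p)
        inv = pow(prefix[-1], -1, p)      # the only true inversion
        out = [0] * len(xs)
        for i in reversed(range(len(xs))):
            out[i] = inv * prefix[i] % p  # (product of the others) / product
            inv = inv * xs[i] % p         # strip x_i from the running inverse
        return out

    # batch_invert([3, 5, 7], 1009) == [pow(x, -1, 1009) for x in (3, 5, 7)]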

Dr. Bodo Möller, COSY, RUB:
TLS with Elliptic Curves

Elliptic curve cryptography was proposed as early as 1985. The schemes specified since then have meanwhile found their way into national and international standards. Yet elliptic curve cryptography is still not widely deployed in practice: anyone who accesses a server from a web browser over an encrypted SSL/TLS connection ("HTTPS") usually relies exclusively on RSA as the public-key scheme, and in a way that does not meet strict cryptographic requirements.

This is about to change: TLS with elliptic curves is specified in the new RFC 4492 (N. Bolyard, V. Gupta, C. Hawk, B. Möller) and has already been implemented by many software vendors; with Windows Vista and new versions of OpenSSL and of the Mozilla browser Firefox, the use of elliptic curves may soon become commonplace. The talk presents the background.

Dr. Sigrid Gürgens, Fraunhofer SIT, Darmstadt:
A Definitional Framework for Security Properties

It is generally accepted that security aspects play an important role for mobile communication systems and other types of distributed systems, and that security must be taken into account in all phases of system development. There is a wide range of approaches for specifying security requirements, but each of them can only cover a subset of the properties desired for a system. The talk presents an approach to the specification of security properties that makes it possible to specify a wide range of security properties for a single discrete system model. Property-preserving homomorphisms allow these properties to be transported from higher to lower abstraction levels of the system and thus support the system development process.

Dr. Marian Margraf, BSI, Bonn:
The European ePassport

The speaker will first briefly discuss the background that led to the introduction of the new passports with biometric features. The main part of the talk then covers the following three protocols used in the European passports:
- Basic Access Control (goals: protection against unauthorized reading, encryption of the connection; see the key-derivation sketch after this list)
- Chip Authentication (goals: proof that the RF chip in the passport has not been cloned, establishment of strong encryption)
- Terminal Authentication (goal: inspection systems may only access sensitive data, such as fingerprints, if they hold a certificate issued by the country of the passport holder)
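
To make the first protocol concrete, here is a Python sketch of the Basic Access Control key derivation as standardized in ICAO Doc 9303: the access keys are derived from the machine-readable zone (MRZ), namely document number, date of birth, and date of expiry, each followed by its check digit. The example MRZ values below are hypothetical, and the DES parity-bit adjustment is omitted:

    import hashlib

    def bac_keys(mrz_info: str):
        # mrz_info = document number + birth date + expiry date,
        # each with its trailing check digit (ICAO Doc 9303).
        k_seed = hashlib.sha1(mrz_info.encode("ascii")).digest()[:16]
        def derive(c: int) -> bytes:
            # Key-derivation counter: 1 -> K_enc, 2 -> K_mac.
            return hashlib.sha1(k_seed + c.to_bytes(4, "big")).digest()[:16]
        return derive(1), derive(2)    # two-key 3DES keys (parity ignored)

    # Hypothetical MRZ fields: document "L898902C<" (check digit 3),
    # born 690806 (check 1), expires 940623 (check 6).
    k_enc, k_mac = bac_keys("L898902C<3" + "6908061" + "9406236")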

Prof. Dr. Roberto Avanzi, CITS, RUB:
On Redundant τ-adic Expansions, the Characterization of Non-Adjacent Digit Sets and Their Applications

We discuss τ-adic expansions of scalars, which are important in the design of scalar multiplication algorithms for Koblitz curves, but are also less understood than their binary counterparts.

At Crypto '97 Solinas introduced the width-w τ-adic non-adjacent form for use with Koblitz curves. It is an expansion of integers z = Σ_{i=0}^{l} z_i τ^i, where τ is a quadratic integer depending on the curve, such that z_i ≠ 0 implies z_{w+i-1} = ... = z_{i+1} = 0, like the sliding-window binary recodings of integers. We show that the digit sets described by Solinas, formed by elements of minimal norm in their residue classes, are uniquely determined. We also show that, unlike for binary representations, syntactic constraints do not necessarily imply minimality of weight.
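
For intuition, here is the binary counterpart referred to above: a Python sketch of the width-w non-adjacent form of an ordinary integer, in which every non-zero digit is odd, lies strictly between -2^(w-1) and 2^(w-1), and is followed by at least w-1 zero digits. The τ-adic recodings replace division by 2 with division by τ.

    def wnaf(z: int, w: int) -> list[int]:
        # Width-w NAF of z >= 0: digits d_i with z = sum d_i * 2^i,
        # every non-zero d_i odd with |d_i| < 2^(w-1), and each
        # non-zero digit followed by at least w-1 zeros.
        digits = []
        while z:
            if z & 1:
                d = z % (1 << w)           # residue modulo 2^w ...
                if d >= 1 << (w - 1):      # ... recentered around zero
                    d -= 1 << w
                z -= d
            else:
                d = 0
            digits.append(d)
            z >>= 1
        return digits                      # least significant digit first

    assert wnaf(11, 2) == [-1, 0, -1, 0, 1]   # 11 = 16 - 4 - 1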

Digit sets that permit recoding of all inputs are characterized. Two new digit sets are introduced with useful properties; one set makes precomputations easier, the second set is suitable for low-memory applications, generalising an approach started by Avanzi, Ciet, and Sica at PKC 2004 and continued by several authors since, including Avanzi, Heuberger and Prodinger, as well as Okeya, Takagi and Vuillaume. Results by Solinas, and by Blake, Murty, and Xu are generalized.

Termination, optimality, and cryptographic applications are considered. The most important application is the ability to perform arbitrary windowed scalar multiplication on Koblitz curves without storing any precomputations first, thus reducing memory storage to just one or two points and the scalar itself.

The speaker will also say a few words about the interaction of these results with those of his previous talk in this seminar.

Prof. Dr. Clemens Martin, University of Ontario, Canada:
Managing IT Security - a Quantifiable Endeavour?

Measuring the performance of information security is an increasingly important need and a key tool for management and decision makers in any organization, as attacks and the accompanying financial losses become more and more significant. At the same time, spending on information security programs is under increased scrutiny regarding its return on investment. From a business perspective, what holds for many other business processes also holds for IT security: "If you cannot measure it, you cannot control it; and if you cannot control it, you cannot improve it."

Metrics are important tools for measuring information security performance, for several reasons. Determining the security posture at any given time goes beyond today's practice of regular security assessments; the goal is to be able to answer the question "How secure am I?" at any point in time. Measuring security is also one component of ensuring compliance with new laws and regulations. This is an area of growing interest for companies in North America and Europe, particularly in the light of corporate financial scandals such as Enron and WorldCom: auditors today want to know how the digital crown jewels are protected before they issue a clean bill of health for their clients. A second driving force is to improve accountability and to achieve efficiencies in running information security programs in organizations, and thus ultimately to improve security. Knowing where the organization stands with respect to IT security helps decision makers determine the success of security investments - and their return on investment - as well as direct future efforts.

Our research concentrates on addressing the above questions. We describe a model and method to build a performance measurement framework. We discuss how different types of security indicators can be determined and we describe difficulties with others.

As a second aspect of measuring IT security, we describe an approach for integrating security controls into a well-established and accepted business quality framework. The European Foundation for Quality Management (EFQM) framework is a highly recognized business model employed by many European businesses to achieve business excellence. It is a documented approach that uses a number of metrics to determine the Total Quality Management (TQM) of an organization by assessing nine different criteria. The US National Institute of Standards and Technology (NIST), in turn, has outlined 17 controls, categorized into managerial, operational, and technical controls, from which the security state of an organization can be deduced. While both perspectives are equally important, neither can comprehensively capture the success of the business in isolation.

Realistically, an organization that strives to excel and gain a competitive edge over its competitors, while at the same time pleasing shareholders and clients, should adopt quality standards based on a holistic management concept. This is what the EFQM strives to instill, but it is limited from a security standpoint. Security is a growing concern and must be addressed as a quality issue in order to comply with legal stipulations, social and ethical obligations, and productivity goals, which in turn must reflect the principles of confidentiality, integrity, and availability. Hence, we propose that these two perspectives be merged into a framework that addresses both the security and the business-excellence ideals and truly reflects the direction an organization is heading in terms of profitability and long-term sustainability in a very security-conscious world.

In this presentation we give an overview of where we currently stand with this research program and of what we are going to work on in the future.

Prof. Dr. Berk Sunar, Worcester Polytechnic Institute, USA:
Non-linear Residue Codes for Robust Public-Key Arithmetic

An active side-channel attack such as differential fault analysis (DFA) relies on injected faults manifesting themselves as erroneous results that can be observed at the output of the device. Apart from Bellcore-style attacks, there exists another type of fault attack, which is aimed at common countermeasures against passive attacks. To thwart power and electromagnetic analysis techniques, many VLSI implementations nowadays employ power-balanced logic gate libraries whose power consumption, and hence electromagnetic emanation, is data-independent. New fault attacks aim at introducing glitches into the circuit which cause such gates to lose their balance, i.e. to reveal data through power imbalances. This opens the door to various classical attacks on the circuit, such as simple and differential power analysis (SPA, DPA) and their electromagnetic counterparts (SEMA, DEMA). All this demonstrates the urgent need for a truly robust error detection scheme.

In this talk we present a scheme for robust multi-precision arithmetic over the positive integers, protected by a novel family of non-linear arithmetic residue codes. These codes have a very high probability of detecting arbitrary errors of any weight. Our scheme lends itself well to straightforward implementations of standard modular multiplication techniques, e.g. Montgomery or Barrett multiplication, that are secure against active fault injection attacks. Due to the non-linearity of the code, the probability of detecting an error depends not only on the error pattern but also on the data. Since the latter is usually not known to the adversary a priori, the successful injection of an undetected error is highly unlikely. We outline a proof of the robustness of these codes by providing an upper bound on the number of undetectable errors.
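
To convey the flavour of such codes (a toy sketch only, not the construction from the talk): each operand is accompanied by a non-linear check symbol, here a cubic residue modulo a small check modulus. Because cubing is multiplicative, the check of a product can be predicted from the operands' checks, and whether a given additive error escapes detection depends on the data itself:

    # Toy non-linear arithmetic residue check for multiplication.
    # The cubic check and the modulus are illustrative assumptions,
    # not the exact code family presented in the talk.
    R = 251                                  # small check modulus

    def check(x: int) -> int:
        return pow(x, 3, R)                  # non-linear check symbol

    def protected_mul(x, cx, y, cy):
        # Cubing is multiplicative, so the product's check symbol is
        # predictable from the operands' checks alone.
        return x * y, cx * cy % R

    def verify(z: int, cz: int) -> bool:
        return check(z) == cz

    x, y = 123456789, 987654321
    z, cz = protected_mul(x, check(x), y, check(y))
    assert verify(z, cz)                     # fault-free result passes
    assert not verify(z ^ (1 << 20), cz)     # this injected fault is caught;
                                             # detection is data dependent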

Our codes are attractive due to their data-dependent and asymptotically low probability of missing errors. These properties make it nearly impossible for an adversary to successfully inject faults that are missed by the error detection network. Only if the attacker can read out the live state of the circuit and instantly compute an undetectable error vector will the attack succeed.