
Bios and Abstracts


Jim Maltby

Keynote speaker: Dr. James Maltby - YarcData

Dr. James Maltby is a Solution Architect for YarcData Inc., and specializes in mapping scientific and business applications to new computer architectures. He has an academic background in physics and engineering, specializing in radiation transport. He has worked for Cray since 2000, developing software for the massively multithreaded Cray XMT (and its MTA-1 predecessor) as well as other Cray systems. He also led the Bioinformatics practice at Cray for several years, using HPC to solve Life Science problems. His most recent project involved developing a highly parallel in-memory Semantic Database for the XMT architecture, now released as uRiKA by YarcData.


Ian Horrocks

Keynote speaker: Prof. Ian Horrocks - Oxford University

Ian Horrocks is a Professor in the Oxford University Department of Computer Science and is a Fellow of Oriel College Oxford. His research interests include logic-based knowledge representation and reasoning and the semantic web, with a particular focus on ontology languages and applications. He was one of the key authors of the OIL, DAML+OIL, and OWL ontology language standards, chaired the W3C working group that standardised OWL 2, and developed many of the algorithms, optimisation techniques and reasoning systems that underpin OWL applications. He is a Fellow of the Royal Society, a member of Academia Europaea, an ECCAI Fellow, and is amongst the most highly cited authors in Computer Science.


Maurizio Lenzerini

Prof. Maurizio Lenzerini - Università di Roma La Sapienza

Maurizio Lenzerini is a Full Professor in Computer Science at Università di Roma La Sapienza. He conducts research in data management, knowledge representation and reasoning, information integration, and service-oriented computing. He is the author of more than 250 publications in international conferences and journals, and has been an invited speaker at many international conferences. He was the Chair of the Executive Committee of the ACM Symposium on Principles of Database Systems in 2010-2012, and is currently a member of that committee. He is a Fellow of the European Coordinating Committee for Artificial Intelligence (ECCAI), a Fellow of the Association for Computing Machinery (ACM), a member of the Academia Europaea - The Academy of Europe, and the recipient of several research awards, including an IBM University Shared Research Award and an IBM Faculty Award.


Josef Noll

Prof. Josef Noll - University of Oslo

Josef Noll is a professor at the University of Oslo in the area of Mobile Services. His group, ConnectedLife, focuses on mobile-based trust and authentication, personalised and context-aware service provisioning, and service continuation in 5G systems. He is a co-founder and steering board member of the Center for Wireless Innovation Norway and of Mobile Monday Norway.

In the area of the Internet of Things he is project leader of the JU Artemis pSHIELD project. Previously he was a Senior Advisor at Telenor R&I in the Products and Markets group, programme manager for the UMTS++ Mobile Broadband Access programme, and project leader of several European projects.

Prof. Noll received the IARIA fellow award in 2008. He is a reviewer for several EU FP6/FP7/FP8 projects, a referee for the European Research Council (ERC) Starting Grants, and an evaluator for the EU's framework programme FP7, the Joint Undertaking (JU) Artemis, the Dutch IOP, the Austrian FIT, the Qatar National Research Fund and the Cyprus research programmes.

Measurable Security for the Internet of Things

Business intelligence is moving towards the real-time handling of information coming from both internal and external business processes. The inclusion of sensor data in automatic process control has been a topic in industry for quite a while, but has mainly been limited to closed systems. Trends in collaborative industries like oil & gas show that sensor data might contribute to automatic processes in different domains, fostered by the vision of the Internet of Things.

The presentation will address the challenges of communication across domains, focussing on the challenges of new infrastructures, new ways of communication and new devices. Two main trends are visible: (i) wireless sensors contributing to automated processes and (ii) the move of control into mobile devices. The "bring your own device" (BYOD) trend is used to exemplify how devices access processes and information in the enterprise. In the upcoming years not only will phones, tablets and computers demand access, but sensors and embedded systems will also deliver and request information. Sensors will contribute to automated processes, and thus require knowledge management.

Classic threats such as insufficient authentication and loss of devices are addressed through an approach of integrating, managing and securing mobile devices. Such a short-sighted approach, as suggested by leading IT companies, is doomed to fail. A paradigm shift in handling security is required, addressing the need for securing information instead of securing infrastructure. The paradigm shift includes the need for measurable security, and addresses a metrics-based approach for a quantitative assessment of both the potential attack scenario and the security measures of the infrastructure.
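As a hypothetical illustration of such a metrics-based assessment (all component names, security levels, weights and thresholds below are invented for the sketch, not taken from the nSHIELD methodology), the composed security level of a sensor system could be computed from per-component metrics and compared against the level an assumed attack scenario demands:

```python
# Sketch of a metrics-based, quantitative security assessment:
# each component of a sensor system exposes a security metric,
# the composed system level is a criticality-weighted mean, and
# it is compared with the level the attack scenario requires.

def composed_security_level(components):
    """Weighted mean of per-component security metrics (0-100)."""
    total_weight = sum(w for _, w in components.values())
    return sum(level * w for level, w in components.values()) / total_weight

# (security level, criticality weight) per component -- invented values
sensor_system = {
    "authentication": (80, 3),
    "link_encryption": (60, 2),
    "physical_tamper_protection": (40, 1),
}

system_level = composed_security_level(sensor_system)
attack_threshold = 65  # level required to resist the assumed attack scenario
adequate = system_level >= attack_threshold
```

A real methodology would derive the metrics and thresholds from the semantic descriptions of the attack scenario and the infrastructure rather than from hand-picked constants; the point of the sketch is only that composition makes the result measurable.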

Our suggested approach is based on semantic descriptions of the potential attack scenario, of the security-related aspects of the sensors/systems involved, and of the applicable semantic policies. The outcome is a methodology for measurable security that provides composable security for sensor systems. The approach is currently applied in the areas of Railway Security and UAV operation through the European Artemis project nSHIELD (http://newSHIELD.eu).


Stephen Brobst

Stephen Brobst - Teradata Corporation

Stephen Brobst is the Chief Technology Officer for Teradata Corporation. Stephen performed his graduate work in Computer Science at the Massachusetts Institute of Technology, where his Master's and PhD research focused on high-performance parallel processing. He also completed an MBA with joint course and thesis work at the Harvard Business School and the MIT Sloan School of Management.

Stephen has been on the faculty of The Data Warehousing Institute since 1996. He was also appointed to President Barack Obama's Council of Advisors on Science and Technology (PCAST), in the working group on Networking and Information Technology Research and Development (NITRD).


Graham Moore

Graham Moore - BrightstarDB Ltd.

After graduating top of his class with a 1st Class Honours degree in Computer Science from Southampton University, Graham worked at leading information, knowledge and content management companies (DPSL Ltd, STEP GmbH, Empolis, Ontopia) before forming Networked Planet in 2004.

Graham has led teams of developers in building semantic web, knowledge and content management products for over 10 years. Graham's roles have included the development and productisation of leading-edge technology as well as the evangelism and partner network building that is required to take these products to market.

Graham has also been heavily involved in the communities around information and knowledge management technology. He spent 10 years on the ISO 13250 Topic Maps committee, editing the Topic Maps and Topic Maps Constraint Language standards, and is the author of the SDShare information integration protocol.

Introduction to SDShare

SDShare is a stream-based protocol for exposing and consuming data in a standardised way. It uses Atom and RDF as building blocks to allow servers to expose collections of data, and clients to process and update local data stores with copies of the data.

RDF provides a very flexible data model that allows data of any form to be represented. The existing RDF family of standards defines how to serialise the model, how to query the model (SPARQL) and how to update the model (SPARQL Update). However, there is no protocol that defines how a machine can both expose and consume RDF in a system of interconnected nodes. A classic use case for this requirement is where one RDF server contains a collection of master data and another RDF system wishes to use that data. The master data will change over time. This protocol defines how the master data server can publish the changes to the resources it manages and how a client can consume them to ensure it is in sync.
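As a sketch of the consumer side of such a sync (the feed content, resource names and URLs below are invented; a real client would fetch the feed over HTTP from the server's collection and then fetch each fragment's RDF), the following parses an SDShare-style Atom feed of changes and selects the fragments updated since the last synchronisation:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

ATOM = "{http://www.w3.org/2005/Atom}"

# Illustrative SDShare-style fragments feed (invented example data);
# each entry points at the RDF fragment for one changed resource.
FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>changes for collection 'equipment'</title>
  <entry>
    <title>updated resource</title>
    <updated>2013-04-10T09:00:00Z</updated>
    <link rel="alternate" type="application/rdf+xml"
          href="http://example.org/fragments/pump-101"/>
  </entry>
  <entry>
    <title>updated resource</title>
    <updated>2013-04-12T12:30:00Z</updated>
    <link rel="alternate" type="application/rdf+xml"
          href="http://example.org/fragments/valve-7"/>
  </entry>
</feed>"""

def fragments_since(feed_xml, since):
    """Return fragment URLs for entries updated after `since`."""
    root = ET.fromstring(feed_xml)
    urls = []
    for entry in root.findall(ATOM + "entry"):
        updated = datetime.fromisoformat(
            entry.find(ATOM + "updated").text.replace("Z", "+00:00"))
        if updated > since:
            urls.append(entry.find(ATOM + "link").get("href"))
    return urls

# Only fragments changed since the client's last sync need fetching.
last_sync = datetime(2013, 4, 11, tzinfo=timezone.utc)
to_fetch = fragments_since(FEED, last_sync)
```

The client would then retrieve each listed fragment and replace its local copy of that resource's triples, which is what keeps the consuming store in sync with the master.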

While the standard uses RDF for interchange, source systems need not support RDF natively; instead they can transform data into RDF as it is requested. This makes SDShare a solution to the general problem of enterprise information integration, and it compares favourably with data integration via Web Services.
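A minimal sketch of such an on-request transformation, assuming a relational source and an invented vocabulary (the `example.org` URIs and the row data are illustrative only): each row is mapped to RDF triples, here in N-Triples syntax, without the source system storing any RDF itself.

```python
# Sketch: expose a non-RDF source as RDF on request, as SDShare allows.
# Subject and predicate URIs are minted from table and column names;
# the http://example.org/ vocabulary is invented for the illustration.

def row_to_ntriples(table, key, row):
    """Map one relational row to N-Triples using the row's key column."""
    subject = f"<http://example.org/{table}/{row[key]}>"
    lines = []
    for column, value in row.items():
        if column == key:
            continue  # the key names the resource; it is not a property
        predicate = f"<http://example.org/vocab/{column}>"
        lines.append(f'{subject} {predicate} "{value}" .')
    return "\n".join(lines)

row = {"id": "pump-101", "name": "Main feed pump", "status": "active"}
ntriples = row_to_ntriples("equipment", "id", row)
```

A production mapping would also handle datatypes, escaping and object properties, but the shape is the same: the transformation runs when a fragment is requested, so the source stays authoritative.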

The protocol was originally developed as a response to an eGovernment challenge of syndicating asset metadata between government agencies. Since then it has been refined and published as an open standard at SDShare.org, and it is in use in many government and commercial applications. The main business uses for SDShare include enterprise search over multiple sources, master data management, data quality management, content classification, and unified data merging and querying.

This presentation will describe the technical aspects of the standard as well as provide a high-level overview of the most common use cases.


Torulf Mollestad

Torulf Mollestad - SAS

Torulf Mollestad, Ph.D., is a Senior Consultant in Advanced Analytics on the SAS Information Management team in the Nordic region. He works on analytics, data mining and text mining projects across numerous industries and application areas, such as churn prediction, risk modelling, fraud detection, anti-money laundering, and industrial equipment failure prediction.

He was an Associate Professor at NTNU (Norges Teknisk-Naturvitenskapelige Universitet) in Trondheim, Norway, for 8 years and has been with SAS Institute for 12 years.


