liu.se: Search for publications in DiVA
Search results 151 - 200 of 3077
• 151.
Linköping University, The Institute of Technology; Linköping University, Department of Computer and Information Science, Database and information techniques; SINTEF ICT, Trondheim, Norway.
How can the developer benefit from security modeling? 2007. In: Second International Conference on Availability, Reliability and Security, 2007, IEEE Computer Society, 2007, p. 1017-1025. Conference paper (Refereed)

Security has become a necessary part of nearly every software development project, as the overall risk from malicious users is constantly increasing due to greater consequences of failure and growing exposure to security threats. There are few projects today where software security can be ignored. Despite this, security is still rarely taken into account throughout the entire software lifecycle; it is often an afterthought, bolted on late in development, with little thought given to the threats and exposures that exist, or to maintaining security as those threats evolve. Software developers are usually not security experts. However, there are methods and tools available today that can help developers build more secure software. Security modeling, i.e., the modeling of threats and vulnerabilities, is one such method that, when integrated into the software development process, can help developers prevent security problems in software. We discuss these issues and present how modeling tools, vulnerability repositories and development tools can be connected to provide support for secure software development.

• 152.
Linköping University, The Institute of Technology; Linköping University, Department of Computer and Information Science, Database and information techniques.
Towards a Structured Unified Process for Software Security. 2006. In: ICSE Workshop on Software Engineering for Secure Systems, 2006, ACM, 2006, p. 3-10. Conference paper (Refereed)
• 153.
Linköping University, The Institute of Technology; Linköping University, Department of Computer and Information Science, Database and information techniques.
A post-mortem incident modeling method. 2009. In: 2009 International Conference on Availability, Reliability and Security (ARES), Vol. 1-2, IEEE, 2009, p. 1018-1023. Conference paper (Refereed)

Post-mortem analysis after recovery from an incident is recommended by most incident response experts. An analysis of why and how an incident happened is crucial for determining appropriate countermeasures to prevent its recurrence. Currently, there is a lack of structured methods for such an analysis that would identify the causes of a security incident. In this paper, we present a structured method to perform the post-mortem analysis and to model the causes of an incident visually in a graph structure. This method is an extension of our earlier work on modeling software vulnerabilities. The goal of modeling incidents is to develop an understanding of what could have caused the security incident and how its recurrence can be prevented in the future. The method presented in this paper is intended to be used during the post-mortem analysis of incidents by incident response teams.
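As a rough sketch of the idea of modeling incident causes in a graph structure (illustrative only; the paper defines its own notation and method), causes can be stored as a directed graph whose source nodes are candidate root causes:

```python
# Illustrative sketch, not the paper's actual model: incident causes as a
# directed graph, with edges pointing from a cause to the event it enabled.
class IncidentGraph:
    def __init__(self):
        self.edges = {}  # cause -> set of events it enabled

    def add_cause(self, cause, effect):
        self.edges.setdefault(cause, set()).add(effect)
        self.edges.setdefault(effect, set())

    def root_causes(self):
        # Root causes are nodes that are never the effect of another node.
        effects = {e for targets in self.edges.values() for e in targets}
        return sorted(n for n in self.edges if n not in effects)

# Hypothetical incident, for illustration only:
g = IncidentGraph()
g.add_cause("unpatched web server", "remote code execution")
g.add_cause("weak admin password", "privilege escalation")
g.add_cause("remote code execution", "data exfiltration")
g.add_cause("privilege escalation", "data exfiltration")
```

Walking such a graph backwards from the observed incident surfaces the events that a countermeasure must address.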

• 154.
Linköping University, The Institute of Technology; Linköping University, Department of Computer and Information Science, Database and information techniques.
Integrating a security plug-in with the OpenUP/Basic development process. 2008. In: Third International Conference on Availability, Reliability and Security, 2008, IEEE Computer Society, 2008, p. 284-291. Conference paper (Refereed)

In this paper we present a security plug-in for the OpenUP/Basic development process. Our security plug-in is based on a structured unified process for secure software development, named S3P (sustainable software security process). This process provides the formalism required to identify the causes of vulnerabilities and the mitigation techniques that prevent these vulnerabilities. We also present the results of an expert evaluation of the security plug-in. The lessons learned from development of the plug-in and the results of the evaluation will be used when adapting S3P to other software development processes.

• 155.
Linköping University, The Institute of Technology; Linköping University, Department of Computer and Information Science, Database and information techniques.
Introducing Vulnerability Awareness to Common Criteria's Security Targets. 2009. In: The Fourth International Conference on Software Engineering Advances, Portugal, IEEE Computer Society, 2009, p. 419-424. Conference paper (Refereed)

Security of software systems has become one of the biggest concerns in our everyday life, since software systems are increasingly used by individuals, companies and governments. One way to help software system consumers gain assurance about the security measures of software products is to evaluate and certify these products with standard evaluation processes. The Common Criteria (ISO/IEC 15408) evaluation scheme is a standard that is widely used by software vendors. This process does not include information about already known vulnerabilities, their attack data and lessons learned from them. This has resulted in criticisms concerning the accuracy of this evaluation scheme since it might not address the areas in which actual vulnerabilities might occur.

In this paper, we present a methodology that introduces information about threats from vulnerabilities to Common Criteria documents. Our methodology improves the accuracy of the Common Criteria by providing information about known vulnerabilities in Common Criteria’s security target. Our methodology also provides documentation about how to fulfill certain security requirements, which can reduce the time for evaluation of the products.

• 156.
Linköping University, Department of Computer and Information Science.
Visualisering av kontinuerlig integration (Visualization of continuous integration). 2017. Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis

This report describes a bachelor's project carried out by eight students from the computer engineering and software engineering programmes at the Institute of Technology at Linköping University. The goal of the project was to develop an application for visualizing continuous integration. The application was commissioned by Ola Leifler and Kristian Sandahl at the Department of Computer and Information Science (IDA) at Linköping University, on behalf of Software Center.

The development work followed the agile methodology Scrum, with various adaptations. During development, several kinds of prototypes were produced to establish the customer's requirements for the application and to ensure that the project group's and the customer's views of the application agreed.

The project resulted in an application that visualizes continuous integration at three different levels. The project group also gained experience in developing software from start to delivery, in clarifying a customer's requirements with the help of prototypes, and in the group dynamics of a project team.

The report contains eight individual contributions, in which each project member has written a section on an experience or an in-depth study in an area connected to their project role or to the development work.

• 157.
Stony Brook University, USA; Computer Science, University of Arizona, USA; Institute of Computer Science, Universität Bayreuth, Germany; Linköping University, Department of Science and Technology, Communications and Transport Systems; Linköping University, Faculty of Science & Engineering; Institute of Computer Science, Freie Universität Berlin, Germany; Department of Computer Science, University of Finland.
Shortest path to a segment and quickest visibility queries. 2016. In: LIPIcs - Leibniz International Proceedings in Informatics, 2016, Vol. 7, p. 77-100. Conference paper (Refereed)

We show how to preprocess a polygonal domain with a fixed starting point s in order to answer efficiently the following queries: Given a point q, how should one move from s in order to see q as soon as possible? This query resembles the well-known shortest-path-to-a-point query, except that the latter asks for the fastest way to reach q, instead of seeing it. Our solution methods include a data structure for a different generalization of shortest-path-to-a-point queries, which may be of independent interest: to report efficiently a shortest path from s to a query segment in the domain.
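The obstacle-free base case behind such segment queries reduces to finding the closest point on a segment; a minimal sketch (the paper itself handles the much harder case of polygonal domains with holes):

```python
# Sketch of the obstacle-free base case: the closest point on segment AB to
# a point S, found by projecting S onto the line AB and clamping to [A, B].
def closest_point_on_segment(s, a, b):
    ax, ay = a
    bx, by = b
    sx, sy = s
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0:
        return a  # degenerate segment: both endpoints coincide
    # Parameter t of the projection of S onto line AB, clamped to the segment.
    t = ((sx - ax) * dx + (sy - ay) * dy) / seg_len_sq
    t = max(0.0, min(1.0, t))
    return (ax + t * dx, ay + t * dy)
```

With obstacles present, this per-segment computation is what the preprocessing structure must effectively answer along geodesic (shortest-path) distances rather than straight-line ones.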

• 158. Arlitt, Martin; Carlsson, Niklas; Rolia, Jerry
Proceedings of the Third GreenMetrics '11 Workshop, in conjunction with (and sponsored by) ACM SIGMETRICS: ACM Performance Evaluation Review (PER), Special Issue on the 2011 GreenMetrics Workshop, Volume 39, Issue 3, December 2011. 2011. Conference proceedings (editor) (Refereed)
• 159.
HP Labs; University of Calgary, Canada; Linköping University, Department of Computer and Information Science, Database and information techniques; Linköping University, The Institute of Technology.
Passive Crowd-based Monitoring of World Wide Web Infrastructure and its Performance. 2012. In: Proc. IEEE International Conference on Communications (ICC 2012), IEEE, 2012, p. 2689-2694. Conference paper (Refereed)

The World Wide Web and the services it provides are continually evolving. Even for a single time instant, it is a complex task to methodologically determine the infrastructure over which these services are provided and the corresponding effect on user-perceived performance. For such tasks, researchers typically rely on active measurements or large numbers of volunteer users. In this paper, we consider an alternative approach, which we refer to as passive crowd-based monitoring. More specifically, we use passively collected proxy logs from a global enterprise to observe differences in the quality of service (QoS) experienced by users on different continents. We also show how this technique can measure properties of the underlying infrastructures of different Web content providers. While some of these properties have been observed using active measurements, we are the first to show that many of these properties (such as location of servers) can be obtained using passive measurements of actual user activity. Passive crowd-based monitoring has the advantages that it adds no overhead to the Web infrastructure and requires no specific software on the clients, while still capturing the performance and infrastructure observed through actual Web usage.
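The kind of passive aggregation described, deriving per-region performance from already-collected proxy logs, can be sketched as follows; the log field names are illustrative assumptions, not the paper's actual log schema:

```python
# Sketch of passive measurement: aggregate response times per client region
# from existing proxy log records. Field names ("region", "response_ms") are
# hypothetical, chosen only for illustration.
from collections import defaultdict

def median_latency_by_region(log_records):
    """Median response time per client region from passively collected logs."""
    by_region = defaultdict(list)
    for rec in log_records:
        by_region[rec["region"]].append(rec["response_ms"])
    result = {}
    for region, samples in by_region.items():
        samples.sort()
        mid = len(samples) // 2
        if len(samples) % 2:
            result[region] = samples[mid]
        else:
            result[region] = (samples[mid - 1] + samples[mid]) / 2
    return result

logs = [
    {"region": "EU", "response_ms": 120},
    {"region": "EU", "response_ms": 80},
    {"region": "NA", "response_ms": 60},
    {"region": "EU", "response_ms": 100},
]
medians = median_latency_by_region(logs)
```

Because the logs already exist, this kind of analysis imposes no extra load on the measured infrastructure, which is the core advantage the abstract claims.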

• 160.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology.
RESTful Mobile Application for Android: Mobile Version of Inspectera Online. 2014. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis

Web service-based mobile applications have become emergent in the recent years. Representational State Transfer (REST) architecture style introduced the concept of Resource Oriented Architecture (ROA), which has been widely used for building applications for all platforms. This master’s thesis designs and develops a Web service-based mobile application for Android platform following the constraints of REST architectural style. It also proposes an authentication model for RESTful applications. The master’s thesis is completed at the company Inspectera HK AB in Norrköping, Sweden. The developed application is called the “Mobile version of Inspectera Online.”
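The stateless, token-based flavor of authentication such a REST design typically implies can be illustrated with a small sketch; the path, header names, and token format here are assumptions for illustration, not details taken from the thesis:

```python
# Illustrative sketch of token-based authentication for a stateless REST
# client. The endpoint path and "Bearer" scheme are hypothetical examples.
def build_authenticated_request(method, path, token):
    # Every request carries the token itself, so the server keeps no session
    # state between calls -- a core REST constraint.
    return {
        "method": method,
        "path": path,
        "headers": {
            "Authorization": "Bearer " + token,  # hypothetical scheme
            "Accept": "application/json",
        },
    }

req = build_authenticated_request("GET", "/api/inspections/42", "abc123")
```

Attaching credentials to each request, rather than to a server-side session, is what lets REST resources be cached and addressed uniformly.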

• 161.
Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory. Linköping University, The Institute of Technology.
Automatic Parallelization of Equation-Based Simulation Programs. 2006. Doctoral thesis, monograph (Other academic)

Modern equation-based object-oriented modeling languages which have emerged during the past decades make it easier to build models of large and complex systems. The increasing size and complexity of modeled systems requires high performance execution of the simulation code derived from such models. More efficient compilation and code optimization techniques can help to some extent. However, a number of heavy-duty simulation applications require the use of high performance parallel computers in order to obtain acceptable execution times. Unfortunately, the possible additional performance offered by parallel computer architectures requires the simulation program to be expressed in a way that makes the potential parallelism accessible to the parallel computer. Manual parallelization of computer programs is generally a tedious and error prone process. Therefore, it would be very attractive to achieve automatic parallelization of simulation programs.

This thesis presents solutions to the research problem of finding practically usable methods for automatic parallelization of simulation codes produced from models in typical equation-based object-oriented languages. The methods have been implemented in a tool to automatically translate models in the Modelica modeling language to parallel codes which can be efficiently executed on parallel computers. The tool has been evaluated on several application models. The research problem includes the problem of how to extract a sufficient amount of parallelism from equations represented in the form of a data dependency graph (task graph), requiring analysis of the code at a level as detailed as individual expressions. Moreover, efficient clustering algorithms for building clusters of tasks from the task graph are also required. One of the major contributions of this thesis work is a new approach for merging fine-grained tasks by using a graph rewrite system. Results from using this method show that it is efficient in merging task graphs, thereby decreasing their size, while still retaining a reasonable amount of parallelism. Moreover, the new task-merging approach is generally applicable to programs which can be represented as static (or almost static) task graphs, not only to code from equation-based models.

An early prototype called DSBPart was developed to perform parallelization of codes produced by the Dymola tool. The final research prototype is the ModPar tool, which is part of the OpenModelica framework. Results from using the DSBPart and ModPar tools show that the amount of parallelism of complex models varies substantially between different application models, and in some cases can produce reasonable speedups. Also, different optimization techniques used on the system of equations from a model affect the amount of parallelism of the model and thus influence how much is gained by parallelization.
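As a rough illustration of the task-merging idea (not the graph rewrite system actually used in the thesis), one simple rule merges linear chains of tasks, i.e., runs of tasks where each task has a single predecessor that has a single successor:

```python
# Illustrative sketch of one task-merging rule: collapse linear chains in a
# task graph. This is a simplification of, not the thesis's, rewrite system.
def merge_chains(succ):
    """succ maps each task to the list of tasks that depend on it."""
    preds = {t: [] for t in succ}
    for s, targets in succ.items():
        for t in targets:
            preds[t].append(s)

    def chained_to_pred(t):
        # True if t linearly extends a chain from its unique predecessor.
        return len(preds[t]) == 1 and len(succ[preds[t][0]]) == 1

    clusters = []
    for task in succ:
        if chained_to_pred(task):
            continue  # interior of a chain; emitted with its chain head
        chain = [task]
        while len(succ[chain[-1]]) == 1:
            nxt = succ[chain[-1]][0]
            if len(preds[nxt]) != 1:
                break  # nxt joins several branches; stop the chain here
            chain.append(nxt)
        clusters.append(chain)
    return clusters

# A->B->C with an extra edge D->C: A and B merge; C and D stay separate.
succ = {"A": ["B"], "B": ["C"], "C": [], "D": ["C"]}
clusters = merge_chains(succ)
```

Merging shrinks the graph without losing dependency structure, which is exactly the trade-off the abstract describes: smaller task graphs that still retain exploitable parallelism.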

• 162.
Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory. Linköping University, The Institute of Technology.
Automatic Parallelization of Simulation Code from Equation Based Simulation Languages. 2002. Licentiate thesis, monograph (Other academic)

Modern state-of-the-art equation-based object-oriented modeling languages such as Modelica have enabled easy modeling of large and complex physical systems. When such complex models are to be simulated, simulation tools typically perform a number of optimizations on the underlying set of equations in the modeled system, with the goal of gaining better simulation performance by decreasing the equation system size and complexity. The tools then typically generate efficient code to obtain fast execution of the simulations. However, with the increasing complexity of modeled systems, the number of equations and variables is increasing. Therefore, to be able to simulate these large complex systems efficiently, parallel computing can be exploited.

This thesis presents the work of building an automatic parallelization tool that produces an efficient parallel version of the simulation code by building a data dependency graph (task graph) from the simulation code and applying efficient scheduling and clustering algorithms on the task graph. Various scheduling and clustering algorithms, adapted for the requirements from this type of simulation code, have been implemented and evaluated. The scheduling and clustering algorithms presented and evaluated can also be used for functional dataflow languages in general, since the algorithms work on a task graph with dataflow edges between nodes.

Results are given in the form of speedup measurements and task graph statistics produced by the tool. The conclusion drawn is that some of the algorithms investigated and adapted in this work give reasonable measured speedups for some specific Modelica models; e.g., a model of a thermofluid pipe gave a speedup of about 2.5 on 8 processors in a PC cluster. However, future work lies in finding a good algorithm that works well in general.
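A minimal sketch of the list-scheduling family of algorithms evaluated in such work, assuming given task durations and ignoring communication costs (both simplifications relative to the thesis):

```python
# Illustrative greedy list scheduler: place each task, in topological order,
# on the processor where it can finish earliest. Communication cost between
# processors is ignored in this sketch.
from collections import deque

def list_schedule(duration, succ, n_procs):
    preds = {t: [] for t in duration}
    for s, targets in succ.items():
        for t in targets:
            preds[t].append(s)
    # Topological order via Kahn's algorithm.
    indeg = {t: len(preds[t]) for t in duration}
    order = []
    queue = deque(t for t in duration if indeg[t] == 0)
    while queue:
        t = queue.popleft()
        order.append(t)
        for u in succ[t]:
            indeg[u] -= 1
            if indeg[u] == 0:
                queue.append(u)
    proc_free = [0.0] * n_procs
    finish, assignment = {}, {}
    for t in order:
        # Earliest start: all predecessors done and a processor available.
        ready = max((finish[p] for p in preds[t]), default=0.0)
        p = min(range(n_procs), key=lambda i: max(proc_free[i], ready))
        start = max(proc_free[p], ready)
        finish[t] = start + duration[t]
        proc_free[p] = finish[t]
        assignment[t] = p
    return assignment, max(finish.values())

# Three independent tasks feeding a final one, on 2 processors:
durations = {"a": 2, "b": 2, "c": 2, "d": 1}
succ = {"a": ["d"], "b": ["d"], "c": ["d"], "d": []}
assignment, makespan = list_schedule(durations, succ, n_procs=2)
```

Here the sequential time is 7 but the two-processor schedule finishes in 5, illustrating the kind of speedup measurement the abstract reports.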

• 163.
Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory. Linköping University, The Institute of Technology.
Extendable Physical Unit Checking with Understandable Error Reporting. 2009. In: Proceedings of the 7th International Modelica Conference, Como, Italy, 20-22 September 2009, Linköping: Linköping University Electronic Press, Linköpings universitet, 2009, p. 890-897. Conference paper (Refereed)

Dimensional analysis and physical unit checking are important tools for helping users to detect and correct mistakes in dynamic mathematical models. To make tools useful in a broad range of domains, it is important to also support other units than the SI standard. For instance, such units are common in biochemical or financial modeling. Furthermore, if two or more units turn out to be in conflict after checking, it is vital that the reported unit information is given in a format understandable to the user; e.g., "N.m" should preferably be shown instead of "m2.kg.s-2", even though they represent the same unit. Presently, there is no standardized solution to handle these problems for Modelica models. The contribution presented in this paper is twofold. Firstly, we propose an extension to the Modelica language that makes it possible for a library designer to define both new base units and derived units within Modelica models and packages. Today this information is implicitly defined in the specification. Secondly, we describe and analyze a solution to the problem of presenting units to users in a more convenient way, based on an algorithm using Mixed Integer Programming (MIP). Both solutions are implemented, tested, and illustrated with several examples.
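The dimensional bookkeeping behind such unit checking can be sketched by representing each unit as a vector of base-unit exponents; only three SI base dimensions (m, kg, s) are included here for brevity, and the hard part the paper solves, choosing a readable derived-unit name for a given vector, is not attempted:

```python
# Units as vectors of base-unit exponents over (m, kg, s); multiplying
# quantities adds the vectors. Three dimensions shown for brevity.
BASE = {"m": (1, 0, 0), "kg": (0, 1, 0), "s": (0, 0, 1)}
DERIVED = {"N": (1, 1, -2)}  # newton = m.kg.s-2

def unit_vector(factors):
    """factors: list of (unit_name, exponent) pairs, e.g. [("N", 1), ("m", 1)]."""
    total = (0, 0, 0)
    for name, exp in factors:
        vec = BASE.get(name) or DERIVED[name]
        total = tuple(t + exp * v for t, v in zip(total, vec))
    return total

# "N.m" and "m2.kg.s-2" reduce to the same base-unit vector:
torque = unit_vector([("N", 1), ("m", 1)])
expanded = unit_vector([("m", 2), ("kg", 1), ("s", -2)])
```

Two unit expressions are compatible exactly when their exponent vectors match; picking the friendliest derived-unit spelling for a vector is the search problem the paper attacks with MIP.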

• 164.
Linköping University, Department of Computer and Information Science.
Multiprocessor Scheduling of Simulation Code from Modelica Models. 2002. Conference paper (Refereed)
• 165.
Linköping University, Department of Computer and Information Science.
• 166. Aroyo, Lora; Welty, Chris; Alani, Harith; Taylor, Jamie; Bernstein, Abraham; Kagal, Lalana; Noy, Natasha Fridman; Blomqvist, Eva (Linköping University, Department of Computer and Information Science, Human-Centered systems; Linköping University, The Institute of Technology)
The Semantic Web - ISWC 2011 - 10th International Semantic Web Conference, Bonn, Germany, October 23-27, 2011, Proceedings, Part I. 2011. Conference proceedings (editor) (Other academic)
• 167. Aroyo, Lora; Welty, Chris; Alani, Harith; Taylor, Jamie; Bernstein, Abraham; Kagal, Lalana; Noy, Natasha Fridman; Blomqvist, Eva (Linköping University, Department of Computer and Information Science, Human-Centered systems; Linköping University, The Institute of Technology)
The Semantic Web - ISWC 2011 - 10th International Semantic Web Conference, Bonn, Germany, October 23-27, 2011, Proceedings, Part II. 2011. Conference proceedings (editor) (Other academic)
• 168.
Linköping University, Department of Computer and Information Science, KPLAB - Knowledge Processing Lab. Linköping University, The Institute of Technology.

The number of domains and tasks where information extraction tools can be used needs to be increased. One way to reach this goal is to construct user-driven information extraction systems where novice users are able to adapt them to new domains and tasks. To accomplish this goal, the systems need to become more intelligent and able to learn to extract information without the need for expert skills or time-consuming work from the user.

The type of information extraction system that is in focus for this thesis is semistructural information extraction. The term semi-structural refers to documents that not only contain natural language text but also additional structural information. The typical application is information extraction from World Wide Web hypertext documents. By making effective use of not only the link structure but also the structural information within each such document, user-driven extraction systems with high performance can be built.

The extraction process contains several steps where different types of techniques are used. Examples of such types of techniques are those that take advantage of structural, pure syntactic, linguistic, and semantic information. The first step that is in focus for this thesis is the navigation step that takes advantage of the structural information. It is only one part of a complete extraction system, but it is an important part. The use of reinforcement learning algorithms for the navigation step can make the adaptation of the system to new tasks and domains more user-driven. The advantage of using reinforcement learning techniques is that the extraction agent can efficiently learn from its own experience without need for intensive user interactions.

An agent-oriented system was designed to evaluate the approach suggested in this thesis. Initial experiments showed that the training of the navigation step and the approach of the system was promising. However, additional components need to be included in the system before it becomes a fully-fledged user-driven system.

• 169.
Linköping University, Department of Computer and Information Science. Linköping University, The Institute of Technology.
Intelligent semi-structured information extraction: a user-driven approach to information extraction. 2005. Doctoral thesis, monograph (Other academic)

The number of domains and tasks where information extraction tools can be used needs to be increased. One way to reach this goal is to design user-driven information extraction systems where non-expert users are able to adapt them to new domains and tasks. It is difficult to design general extraction systems that do not require expert skills or a large amount of work from the user. Therefore, it is difficult to increase the number of domains and tasks. A possible alternative is to design user-driven systems, which solve that problem by letting a large number of non-expert users adapt the systems themselves. To accomplish this goal, the systems need to become more intelligent and able to learn to extract with as little given information as possible.

The type of information extraction system that is in focus for this thesis is semi-structured information extraction. The term semi-structured refers to documents that not only contain natural language text but also additional structural information. The typical application is information extraction from World Wide Web hypertext documents. By making effective use of not only the link structure but also the structural information within each such document, user-driven extraction systems with high performance can be built.

There are two different approaches presented in this thesis to solve the user-driven extraction problem. The first takes a machine learning approach and tries to solve the problem using a modified Q(λ) reinforcement learning algorithm. A problem with the first approach was that it was difficult to handle extraction from the hidden Web. Since the hidden Web is about 500 times larger than the visible Web, it would be very useful to be able to extract information from that part of the Web as well. The second approach is called the hidden observation approach and tries to also solve the problem of extracting from the hidden Web. The goal is to have a user-driven information extraction system that is also able to handle the hidden Web. The second approach uses a large part of the system developed for the first approach, but the additional information that is silently obtained from the user presents other problems and possibilities.

An agent-oriented system was designed to evaluate the approaches presented in this thesis. A set of experiments was conducted and the results indicate that a user-driven information extraction system is possible and no longer just a concept. However, additional work and research is necessary before a fully-fledged user-driven system can be designed.
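For context, a minimal tabular Q(λ) update with accumulating eligibility traces looks roughly as follows; the thesis uses its own modified variant for Web navigation, so this is only a sketch of the underlying algorithm family, with made-up state and action names:

```python
# One tabular Q(lambda) step with accumulating eligibility traces. This is
# the textbook form, not the thesis's modified variant.
def q_lambda_step(Q, E, s, a, r, s_next, actions,
                  alpha=0.5, gamma=0.9, lam=0.8):
    best_next = max(Q.get((s_next, b), 0.0) for b in actions)
    delta = r + gamma * best_next - Q.get((s, a), 0.0)  # TD error
    E[(s, a)] = E.get((s, a), 0.0) + 1.0  # accumulate trace for (s, a)
    for key in list(E):
        # Credit earlier state-action pairs in proportion to their traces.
        Q[key] = Q.get(key, 0.0) + alpha * delta * E[key]
        E[key] *= gamma * lam  # decay all traces

# Hypothetical navigation step: following a link yields a reward of 1.
Q, E = {}, {}
q_lambda_step(Q, E, "start", "follow_link", 1.0, "page", ["follow_link"])
```

The eligibility traces are what let a single observed reward update the whole recent navigation path at once, which suits learning from sparse user feedback.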

• 170.
CSC, KTH; Linköping University, Department of Computer and Information Science, MDI - Interaction and Service Design Research Group; Linköping University, The Institute of Technology.
Acquisition of usable IT: Acquisition projects to reflect on. 2009. Report (Other academic)

By examining how several organizations have gone through the process of procuring IT systems, we have seen that there is a great need for procurer organizations themselves to understand their role in systems development. What is their responsibility for the outcome of the acquisition process? What is their responsibility for the outcome of the system-in-use? Can they actually take responsibility for the usability of systems? This collection of papers is meant to be a starting point for procurer organizations to reflect on that responsibility, as well as on how they manage the acquisition process. The papers are informed by academic research and grounded in scientific studies, but they are also to be taken as practical efforts to describe the process. We hope they will nurture reflection, and encourage those who are taking a stand to make IT systems usable. Our assumption is that the sooner an organization comes to terms with how the future system will actually be used, the sooner it will be profitable or beneficial.

• 171.
Linköping University, Department of Computer and Information Science, MDI - Interaction and Service Design Research Group. Linköping University, The Institute of Technology.
Good to use!: Use quality of multi-user applications in the home. 2003. Licentiate thesis, monograph (Other academic)

Traditional models of usability are not sufficient for software in the home, since they are built with office software in mind. Previous research suggests that social issues, among other things, separate software in homes from software in offices. In order to explore that further, the use qualities to design for in software for face-to-face meetings at home were contrasted with such systems at offices. They were studied using a pluralistic model of use quality with roots in socio-cultural theory, cognitive systems engineering, and architecture. The research approach was interpretative design cases. Observations, situated interviews, and workshops were conducted at a Swedish bank, and three interactive television appliances were designed and studied in simulated home environments. It is concluded that the use qualities to design for in infotainment services on interactive television are laid-back interaction, togetherness among users, and entertainment. This is quite different from bank office software, which is usually characterised not only by traditional usability criteria such as learnability, flexibility, effectiveness, efficiency, and satisfaction, but also by professional face management and ante-use. Ante-use is the events and activities that precede the actual use and that set the ground for whether the software will have quality in use or not. Furthermore, practices for how to work with use quality values, use quality objectives, and use quality criteria in the interaction design process are suggested. Finally, future research in design of software for several co-present users is proposed.

• 172.
Linköping University, Department of Computer and Information Science.
Optimizing Mobile Phone Free Fall Drop Test Equipment - Precision, Repeatability, and Time Efficiency. 2009. Independent thesis Advanced level (professional degree), 20 credits / 30 HE credits. Student thesis

Free fall drop testing is an important part of the development of commercial electronic components and devices. In the process of optimizing the quality of their entire product range, Sony Ericsson Mobile Communications AB decided to review their free fall drop test equipment with the goal of increasing the precision, repeatability, and time efficiency of their drop test applications. Based on the free fall drop test principle, a robot system with management software, named the Doris Drop Test System, was developed to meet these goals.

As related work for this application is as scarce as the time frame of this project, the development process was empirical and entrepreneurial, with engineering skills as the governing line of work. By combining competence from fields such as mechanics, electronics, and product development, the goals were reached, enabling the identification of two distinct drop methods: Impact Position and Drop Position. Increasing the repeatability from approximately 10% to 85% enables anyone, at any time, to perform the exact same mobile phone drop test. A precision of up to 100% when performing free fall drop tests aimed at specific mobile phone parts speeds up the development process through faster detection of mechanical weaknesses. Achieving these results while also increasing throughput by shortening the testing time has proven the success of the Doris Drop Test System.

• 173.
Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory.
Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory.
Design and Implementation of a User Friendly OpenModelica Graphical Connection Editor2010Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis

OpenModelica (www.openmodelica.org) is an open-source Modelica-based modeling and simulation environment intended for industrial as well as academic usage. Its long-term development is supported by a non-profit organization, the Open Source Modelica Consortium (OSMC), of which Linköping University is a member. The main reason behind this thesis was the need for a user-friendly, efficient, and modular OpenModelica graphical connection editor. The already existing open-source editors were either textual or not very user friendly. As part of this thesis work, a new open-source Qt-based cross-platform graphical user interface, called OMEdit, was designed and implemented, partially based on an existing GUI for hydraulic systems, HOPSAN. The use of the Qt C++ libraries makes this tool more future-safe and also allows it to be easily integrated into other parts of the OpenModelica platform. This thesis aims at developing an advanced, open-source, user-friendly graphical user interface that provides users with easy-to-use model creation, connection editing, simulation of models, and plotting of results. The interface is extensible enough to support user-defined extensions/models. Models can be both textual and graphical. From the annotation information in Modelica models (e.g., Modelica Standard Library components), a connection tree and diagrams can be created. Communication with the OpenModelica Compiler (OMC) subsystem is performed through a CORBA client-server interface. The OMC CORBA server provides an interactive API. The connection editor functions as the front-end and OMC as the back-end. OMEdit communicates with OMC through the interactive API, requests the model information, and creates models/connection diagrams based on the Modelica annotation standard version 3.2.

• 174.
Linköping University, Department of Computer and Information Science.
Customization of Docbook to Generate PDF, HTM & CHM2009Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis

Software documentation is an important aspect of software projects, and it plays a key role in software development if it is kept up-to-date and complete. Software documentation should be synchronised with the software development. One of the problems is duplication: the same information is written in different documents and stored in different places in different formats, making things complex to manage. With traditional documentation tools it is hard and time-consuming to maintain documentation for complex systems.

To overcome these problems we have used DocBook, an XML format that offers a good solution. DocBook provides a single-sourcing technique in which documents are ideally written in one place and can be converted into several other formats from the same source. Since DocBook is based on XML, it can easily be processed by most programming languages. If many developers are writing documentation for their software modules, we do not need to copy and paste all the documents into one document to produce complete documentation for the software product. We just add references to all the files that should be present in the final document and compile it with a processor, which automatically gathers the document contents from all the files and puts them into one document. This makes it easy to handle and maintain software documentation with DocBook.
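The single-sourcing mechanism described in this abstract, where a master document only references chapter files that are pulled in at build time, is based on XInclude. A minimal sketch using only Python's standard library (the file name, element names, and content are invented for the illustration; a real DocBook toolchain would use an XSLT or FO processor instead):

```python
import os
import tempfile
from xml.etree import ElementTree, ElementInclude

workdir = tempfile.mkdtemp()

# One chapter, maintained in its own file by one author.
chapter_path = os.path.join(workdir, "intro.xml")
with open(chapter_path, "w", encoding="utf-8") as f:
    f.write("<chapter><title>Introduction</title></chapter>")

# The master book does not copy the chapter; it only references the file.
master = (
    '<book xmlns:xi="http://www.w3.org/2001/XInclude">'
    f'<xi:include href="{chapter_path}"/>'
    "</book>"
)

root = ElementTree.fromstring(master)
ElementInclude.include(root)  # resolve the xi:include references in place

# After resolution, the chapter content is part of the single output tree.
print(ElementTree.tostring(root, encoding="unicode"))
```

The same master file can then be fed to different stylesheets to produce PDF, HTML, or CHM from one source.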

• 175.
Linköping University, Department of Computer and Information Science. Linköping University, The Institute of Technology.
The roles of IT: studies of organising when implementing and using enterprise systems2004Doctoral thesis, comprehensive summary (Other academic)

This study concerns the implementation and use of enterprise systems (ERP systems) in complex organisations. The purpose of this thesis is to problematise and understand the social organising of information technology in organisations by studying the implementation and use of enterprise systems. This is done by using a multi-theoretical perspective and studying cases of complex organisations with a qualitative and interpretive research method.

The study gives a more profound understanding of the roles of the technology. It is found that the enterprise systems act as Bureaucrat, Manipulator, Administrative assistant, or Consultant, or are dismissed, in the sense that intended users choose to avoid using them. These roles of information technology are formed in a rather complex organising process. A Structuration Theory Analytical Model and Procedure (STAMP) is developed, which serves to illuminate the dynamic relationships between individuals' or groups' interpretations, power, and norms, and how these affect the implementation and use of enterprise systems. The roles were also found to differ between individuals in similar work conditions, owing to how they learned their job, what understanding of the job they developed, and what competences they developed. The different kinds of competences found required different support from the technology, and they also made the individuals take different approaches to how to use it. The study also explores why emotions appear and what they affect, and identifies patterns of emotions and emotional transitions that appear during the implementation and use of an enterprise system.

The social aspect of using technology is in focus in this thesis. Thus, the technology is not just a tool to make excellent use of; it becomes something more: an actor with different roles. The main contribution is the development of a language and an approach for understanding the use and implementation of enterprise systems.

• 176.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, EISLAB - Economic Information Systems.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, EISLAB - Economic Information Systems.
An Emotional analysis of an ERP system implementation2004In: The roles of IT: studies of organising when implementing and using enterprise systems / [ed] Peter Aronsson, Linköping: Linköpings universitet , 2004, p. 131-182Chapter in book (Other academic)
• 177.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, EISLAB - Economic Information Systems.
Linköping University, The Institute of Technology. Linköping University, Department of Management and Engineering.
Five Roles of an Information System: A Social Constructionist Approach to Analysing the Use of ERP Systems2003In: Informing Science, ISSN 1547-9684, E-ISSN 1521-4672, Vol. 6, p. 209-220Article in journal (Refereed)

This paper presents a novel way of thinking about how information systems are used in organisations. Traditionally, computerised information systems are viewed as objects. In contrast, by viewing the information system as an actor, the understanding of the structuration process increases. The user, being influenced by the ERP (Enterprise Resource Planning) system and giving it an actor role, thereby also confers agency on the ERP system; through its very use it influences actions and thus also the structure. Based on a case study of ERP use in an ABB company over a decade, five different roles played by the ERP systems were identified. The ERP systems acted as Bureaucrat, Manipulator, Administrative assistant, or Consultant, or were Dismissed, in the sense that intended users chose to avoid using them. These terms are defined in the full text. The purpose of this approach is not to "animate" the information systems, to give them life or a mind of their own, but rather to make explicit the socially constructed roles conferred on them by users and others who are affected by them. On this basis, it is possible to suggest how the roles can help us open up new areas of exploration concerning the fruitful use of IT.

• 178.
Linköping University, Department of Computer and Information Science.
A Model-Based Approach for Reliability Prediction2010Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis

When developing products, reliability is an important factor that has to be considered. For safety-critical systems it is important to know the probability that an item will perform a required function, without failure, under stated conditions for a stated period of time. The main goal of a reliability prediction analysis is to predict the rate at which the product or system will fail. A number of methodologies are available for performing this prediction.
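As background to the failure-rate notion used in this abstract: under the common constant-failure-rate assumption, the rates of components in a series system add up, and the reliability over a mission time t is R(t) = exp(-λt). A small illustration (the rates below are invented, not FIDES values):

```python
import math

# Invented constant failure rates (failures per hour) for three components
# of a series system; in a FIDES analysis these would come from the
# methodology's physical models and usage profiles.
failure_rates = [2e-6, 5e-7, 1.5e-6]

# In a series system every component must work, so the rates add up.
lambda_total = sum(failure_rates)

def reliability(t_hours: float) -> float:
    # Probability of surviving a mission of t_hours without failure.
    return math.exp(-lambda_total * t_hours)

print(lambda_total)         # 4e-06 failures per hour
print(reliability(10_000))  # exp(-0.04), roughly 0.96
```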

This Master's thesis proposes a model-based approach for reliability prediction calculations based on the physics of failure, supported by analyses of test data and field returns, and on the physical models provided by the FIDES methodology. FIDES-based reliability models have been integrated into a model-based diagnosis environment for seamless integration with other safety assessment analyses.

The model-based diagnosis environment used in this thesis is the model-based reasoner RODON, developed by Uptime Solutions AB. Components that use the FIDES methodology have been developed in RODON, where components can be combined into systems using a drag-and-drop method. Usage profiles that are defined according to the FIDES methodology in RODON are not system-specific, which makes them reusable in other systems. The developed library of components and usage profiles makes it easy to model complex systems and perform reliability predictions according to the FIDES methodology.

• 179.
Linköping University, Department of Computer and Information Science.
Determining the feasibility of automatically translating SMILE to a Java framework2008Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis

MTsim (Mobile Traffic Simulator) is an Ericsson AB internal software application that is part of 2Gsim. It is used to simulate elements of a GSM (Global System for Mobile communications) network for feature testing and automated testing. It is written in TSS Language, also known as SMILE, a proprietary Ericsson programming language. SMILE is based on the principles of state matrix programming, which in essence means that each program is in itself a finite state machine. The language is old and was originally intended as a macro language for smaller test programs, not for applications the size of MTsim.

It is of interest to evaluate the feasibility of performing an automatic conversion of applications written in SMILE, with special interest in converting MTsim, to a Java framework, since Java has many advantages compared to SMILE. Java, as a language, is well suited for larger applications; there are numerous well-supported tools; and competence in Java is much more widespread than in SMILE.

It is clear that in order to do a full conversion of a SMILE program to a Java framework, two applications must be implemented. First, a Java framework that acts as a run-time environment must be designed to host the translated programs. The other part is the actual translator, which takes a SMILE program as input and outputs a translated Java program. A more sophisticated framework is preferable, since it makes the translated programs more lightweight and easier to read, which means a higher degree of maintainability.

There are different ways to implement state machines in Java, but the most flexible and versatile is to implement the machine as a black-box framework in an object-oriented way, where the framework has sophisticated mechanisms for message and event handling, which are central to any state machine framework.
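The black-box pattern described here, where the framework owns the message loop and a concrete machine only declares its transitions, can be sketched language-neutrally. The thesis targets Java; the following minimal Python sketch uses invented states and events purely to illustrate the structure:

```python
from collections import deque

class StateMachine:
    """A minimal event-driven state machine: the framework owns the
    message queue and dispatch loop; concrete machines are just data."""

    def __init__(self, initial, transitions):
        # transitions maps (current state, event) -> next state.
        self.state = initial
        self.transitions = transitions
        self.inbox = deque()

    def send(self, event):
        self.inbox.append(event)

    def run(self):
        # Drain the queue; events with no matching transition are ignored.
        while self.inbox:
            event = self.inbox.popleft()
            self.state = self.transitions.get((self.state, event), self.state)

# An invented call-handling machine, loosely in the spirit of telecom testing.
call = StateMachine("idle", {
    ("idle", "dial"): "ringing",
    ("ringing", "answer"): "connected",
    ("connected", "hangup"): "idle",
})
call.send("dial")
call.send("answer")
call.run()
print(call.state)  # connected
```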

The translation from SMILE can easily be done by using an AST (abstract syntax tree) representation, which is a full representation of the SMILE program in tree form. The AST is obtained from an intermediate stage of the SMILE program compiler.

• 180.
Linköping University, Department of Computer and Information Science.
Linköping University, Department of Computer and Information Science. Linköping University, Department of Computer and Information Science. Linköping University, Department of Computer and Information Science. Linköping University, Department of Computer and Information Science. Linköping University, Department of Computer and Information Science. Linköping University, Department of Computer and Information Science. Linköping University, Department of Computer and Information Science.
Sökmotoroptimering för en e-handelsplattform2019Independent thesis Basic level (degree of Bachelor), 12 credits / 18 HE creditsStudent thesis

The rapid pace of digitization drives major changes in consumer buying behaviour. More and more consumers make use of the internet and search engines when making purchases. E-commerce services have thus become increasingly dependent on search engines and on their ranking on them. Optimization of e-commerce websites is a highly relevant aspect of website traffic; however, little information on how different types of SEO techniques affect page ranking is disclosed by search engine companies. In this context, this study assesses how the page ranking of an e-commerce website, brewinabox.se, is affected by three ranking factors: page speed, keyword density, and site structure, and it illustrates how to optimize with respect to these factors. The study tests different versions of the e-commerce web application, with different adjustments of the ranking factors, against page rank. The tests did not show any clear results and could not confirm the theories and methods suggested by the common literature for each factor. The weak results may be due to insufficient tests with many possible sources of error, especially concerning Google's indexing process. Ultimately, a final version of the web application was implemented using the recommendations, and a clearly positive effect of the search engine optimization was noted.

• 181.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
Automatically proving the correctness of vehicle coordination2018In: ICT Express, ISSN 2405-9595, Vol. 4, no 1, p. 51-54Article in journal (Refereed)

In the next generation of road-based transportation systems, where vehicles exchange information and coordinate their actions, a major challenge will be to ensure that the interaction rules are safe and lead to progress. In this paper we address the problem of automatically verifying the correctness of such distributed vehicular coordination protocols. We propose a novel modeling approach for communicating mobile entities based on the concept of satisfiability modulo theories (SMT). We apply this method to an intersection collision avoidance protocol and show how the method can be used to investigate the settings under which such a protocol achieves safety and progress.
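The SMT-based approach above amounts to asking a solver whether any scenario violates the safety property. As a toy sketch (not the paper's model, which covers continuous dynamics and unbounded numbers of vehicles), one can emit an SMT-LIB query asking whether two vehicles obeying an invented entry-gap rule can ever occupy a conflict zone simultaneously; all names and numbers below are invented:

```python
def safety_query(entry_gap: float, max_crossing: float) -> str:
    """Build an SMT-LIB query: can the occupancy intervals of two vehicles
    overlap, given the entry-gap rule? 'unsat' would mean the rule is safe."""
    lines = [
        "(declare-const t1_in Real)", "(declare-const t1_out Real)",
        "(declare-const t2_in Real)", "(declare-const t2_out Real)",
        # Physical assumptions: crossing takes positive, bounded time.
        f"(assert (and (< t1_in t1_out) (<= (- t1_out t1_in) {max_crossing})))",
        f"(assert (and (< t2_in t2_out) (<= (- t2_out t2_in) {max_crossing})))",
        # Protocol rule under test: entry times are separated by the gap.
        f"(assert (or (>= (- t1_in t2_in) {entry_gap})"
        f" (>= (- t2_in t1_in) {entry_gap})))",
        # Negated safety property: the two occupancy intervals overlap.
        "(assert (and (< t1_in t2_out) (< t2_in t1_out)))",
        "(check-sat)",
    ]
    return "\n".join(lines)

# With a 2.0 s entry gap and at most 1.5 s crossing time, no overlap is
# possible, so a solver such as Z3 would report unsat for this query.
print(safety_query(2.0, 1.5))
```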

• 182.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
Combining Detection and Verification for Secure Vehicular Cooperation Groups2019In: ACM Transactions on Cyber-Physical Systems, ISSN 2378-962X, Vol. 4, no 1Article in journal (Refereed)

Coordinated vehicles for intelligent traffic management are instances of cyber-physical systems with strict correctness requirements. A key building block for these systems is the ability to establish a group membership view that accurately captures the locations of all vehicles in a particular area of interest. In this article, we formally define view correctness in terms of soundness and completeness and establish theoretical bounds for the ability to verify view correctness. Moreover, we present an architecture for an online view detection and verification process that uses the information available locally to a vehicle. This architecture uses an SMT solver to automatically prove view correctness (if possible). We evaluate this architecture using both synthetic and trace-based scenarios and demonstrate that the ability to verify view correctness is on par with the ability to detect view violations.
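The soundness and completeness of a membership view, as used in this abstract, reduce to set comparisons between the view and the set of vehicles actually present. A toy check with invented vehicle identifiers:

```python
def is_sound(view: set, present: set) -> bool:
    # Sound: the view claims no vehicle that is absent from the area.
    return view <= present

def is_complete(view: set, present: set) -> bool:
    # Complete: the view misses no vehicle that is present in the area.
    return present <= view

# Invented ground truth and locally detected view.
present = {"v1", "v2", "v3"}
view = {"v1", "v2"}

print(is_sound(view, present))     # True: nothing spurious in the view
print(is_complete(view, present))  # False: v3 is missing from the view
```

The harder problem addressed in the article is that a vehicle never observes `present` directly and must verify these properties from local information.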

• 183.
Linköping University, Department of Computer and Information Science.
En optimierande kompilator för SMV till CLP(B)2005Independent thesis Basic level (professional degree)Student thesis

This thesis describes an optimising compiler for translating SMV to CLP(B). The optimisation aims at reducing the number of required variables in order to decrease the size of the resulting BDDs; a partitioning of the transition relation is also performed. The compiler uses an internal representation of an FSM that is built up from the SMV description. A number of rewrite steps are performed on the problem description, such as encoding to a Boolean domain and performing the optimisations.

The variable reduction heuristic is based on finding sub-circuits that are suitable for reduction and a state space search is performed on those groups. An evaluation of the results shows that in some cases the compiler is able to greatly reduce the size of the resulting BDDs.

• 184.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
Model-based Membership Verification in Vehicular Platoons2015In: Proceedings: 2015 45th Annual IEEE/IFIP International Conference on Dependable Systems and Networks Workshops , IEEE Computer Society, 2015, p. 125-132Conference paper (Refereed)

Cooperative vehicular systems have the potential to significantly increase traffic efficiency and safety. However, they also raise the question of to what extent information that is received from other vehicles can be trusted. In this paper we present a novel approach for increasing the trustworthiness of cooperative driving through a model-based approach for verifying membership views in vehicular platoons. We define a formal model for platoon membership, cooperative awareness claims, and membership verification mechanisms. With the help of a satisfiability solver, we are able to quantitatively analyse the impact of different system parameters on the verifiability of received information. Our results demonstrate the importance of cross-validating received messages, as well as the surprising difficulty in establishing correct membership views despite powerful verification mechanisms.

• 185.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology.
Poster: Securing Vehicular Platoon Membership2014In: Proceedings of IEEE Vehicular Networking Conference (VNC), IEEE Computer Society, 2014, p. 119-120Conference paper (Refereed)

Vehicular platoons have the potential to bring considerable fuel savings and increase traffic efficiency. A key component of the next generation of platoon systems is a secure membership component that can accommodate membership changes in a dynamic and potentially hostile environment. In this poster paper we analyse the conditions for creating a secure membership protocol which is resilient to attacks and faults in the communication protocols.

• 186.
Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory. Linköping University, The Institute of Technology.
Restoring Consistency after Network Partitions2007Licentiate thesis, monograph (Other academic)

The software industry is facing a great challenge. While systems get more complex and distributed across the world, users are becoming more dependent on their availability. As systems increase in size and complexity so does the risk that some part will fail. Unfortunately, it has proven hard to tackle faults in distributed systems without a rigorous approach. Therefore, it is crucial that the scientific community can provide answers to how distributed computer systems can continue functioning despite faults.

Our contribution in this thesis concerns a special class of faults which occurs when network links fail in such a way that parts of the network become isolated; such faults are termed network partitions. We consider the problem of how systems that have integrity constraints on data can continue operating in the presence of a network partition. Such a system must act optimistically while the network is split and then perform some kind of reconciliation to restore consistency afterwards.

We have formally described four reconciliation algorithms and proven them correct. The novelty of these algorithms lies in the fact that they can restore consistency after network partitions in a system with integrity constraints, and that one of the protocols allows the system to provide service during the reconciliation. We have implemented and evaluated the algorithms using simulation and as part of a partition-tolerant CORBA middleware. The results indicate that it pays off to act optimistically and that it is worthwhile to provide service during reconciliation.
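The core idea of optimistic operation followed by reconciliation can be sketched as replaying the per-partition update logs against the integrity constraint and rejecting updates that no longer hold in the merged state. The constraint, the operations, and the replay order below are all invented for illustration; the thesis treats this far more rigorously:

```python
def reconcile(initial, logs, constraint):
    """Replay the partitions' update logs over the pre-partition state,
    keeping only updates that preserve the integrity constraint."""
    state = dict(initial)
    rejected = []
    # Replay all logged operations in one deterministic order.
    for op in [op for log in logs for op in log]:
        candidate = dict(state)
        candidate[op["key"]] = candidate.get(op["key"], 0) + op["delta"]
        if constraint(candidate):
            state = candidate
        else:
            rejected.append(op)  # must be redone or compensated later
    return state, rejected

# Invented constraint: the two accounts may never sum to less than zero.
non_negative_total = lambda s: s["a"] + s["b"] >= 0

state, rejected = reconcile(
    {"a": 50, "b": 50},
    [[{"key": "a", "delta": -80}],   # accepted optimistically in partition 1
     [{"key": "b", "delta": -80}]],  # accepted optimistically in partition 2
    non_negative_total,
)
print(state)          # {'a': -30, 'b': 50}: the first withdrawal is kept
print(len(rejected))  # 1: the second would violate the constraint
```

Both withdrawals were individually acceptable inside their partitions; only at reconciliation does the conflict surface, which is exactly the trade-off the thesis studies.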

• 187.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
Linköping University, Department of Computer and Information Science. Linköping University, Faculty of Science & Engineering. Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
In-store payments using Bitcoin2018In: 9th IFIP International Conference on New Technologies, Mobility and Security (NTMS), IEEE, 2018Conference paper (Refereed)

The possibility of in-store payments would further increase the potential usefulness of cryptocurrencies. However, this would require much faster transaction verification than current solutions provide (one hour for Bitcoin) since customers are likely not prepared to wait a very long time for their purchase to be accepted by a store. We propose a solution for enabling in-store payments with waiting times in the order of a few seconds, which is still compatible with the current Bitcoin protocol. The idea is based on a payment card in combination with a protocol for ensuring that losing a card does not mean losing the money on it. We analyse the required transaction verification delay and also the potentially added risks that the solution brings compared to current systems.

• 188.
Linköping University, Department of Computer and Information Science. Linköping University, Faculty of Science & Engineering.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering. Aeronautics Institute of Technology, Brazil.
Specification, Implementation and Verification of Dynamic Group Membership for Vehicle Coordination2017In: Dependable Computing (PRDC), 2017 IEEE 22nd Pacific Rim International Symposium on, IEEE, 2017, p. 321-328Conference paper (Refereed)

New advanced traffic management solutions, with fully or semi-autonomous vehicles that communicate over a wireless interface to coordinate their driving decisions, create new challenges in distributed computing. In this paper we address the problem of dynamic group membership in three stages. First, we propose three criteria to specify the correctness and performance of the group views created by such algorithms, in terms of soundness, completeness and freshness. Second, we develop a group membership protocol tailored for vehicular coordination. Finally, we show through simulation and model-based verification that the protocol does indeed meet the criteria and provides at least 95% perfect group membership views under conditions as adverse as 70% packet loss or a very high churn rate.

• 189.
Trinity College Dublin.
Trinity College Dublin. Trinity College Dublin. Trinity College Dublin. Trinity College Dublin.
A Formal Approach to Autonomous Vehicle Coordination2012In: FM 2012: Formal Methods: 18th International Symposium, Paris, France, August 27-31, 2012. Proceedings / [ed] Dimitra Giannakopoulou and Dominique Méry, Springer Berlin/Heidelberg, 2012, p. 52-67Chapter in book (Refereed)

Increasing demands on safety and energy efficiency will require higher levels of automation in transportation systems. This involves dealing with safety-critical distributed coordination. In this paper we demonstrate how a Satisfiability Modulo Theories (SMT) solver can be used to prove correctness of a vehicular coordination problem. We formalise a recent distributed coordination protocol and validate our approach using an intersection collision avoidance (ICA) case study. The system model captures continuous time and space, and an unbounded number of vehicles and messages. The safety of the case study is automatically verified using the Z3 theorem prover.

• 190.
Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory. Linköping University, The Institute of Technology.
Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory. Linköping University, The Institute of Technology.
Formalising Reconciliation in Partitionable Networks with Distributed Services2006In: Rigorous Development of Complex Fault-Tolerant Systems / [ed] Michael Butler, Cliff Jones, Alexander Romanovsky, Elena Troubitsyna, Heidelberg: Springer Verlag , 2006, p. 37-58Chapter in book (Refereed)

This book brings together 19 papers focusing on the application of rigorous design techniques to the development of fault-tolerant, software-based systems. It is an outcome of the REFT 2005 Workshop on Rigorous Engineering of Fault-Tolerant Systems held in conjunction with the Formal Methods 2005 conference at Newcastle upon Tyne, UK, in July 2005.

• 191.
Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory. Linköping University, The Institute of Technology.
Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory. Linköping University, The Institute of Technology.
Post-Partition Reconciliation Protocols for Maintaining Consistency2006In: SAC '06 Proceedings of the 2006 ACM symposium on Applied computing, New York, NY, USA: ACM Press, 2006, p. 710-717Conference paper (Refereed)

This paper addresses design exploration for protocols that are employed in systems with availability-consistency trade-offs. Distributed data is modelled as states of objects replicated across a network, whose updates require satisfaction of integrity constraints over multiple objects. Upon detection of a partition, such a network will continue to provide delivery of services in parallel partitions, but only for updates with non-critical integrity constraints. Once the degraded mode ends, the parallel network partitions are reconciled to arrive at one partition. Using a formal treatment of the reconciliation process, three algorithms are proposed and studied in terms of their influence on service outage duration. The longer the reconciliation time, the lower the system availability, since the interval in which no services are provided is longer. The reconciliation time is in turn affected by the time taken to construct the post-partition system state. The shorter the construction time, the higher the number of updates that took place in degraded mode but will not be taken up in the reconciled partition; this leads to a longer interval for rejecting/redoing these operations and thereby increases the reconciliation time.

• 192.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory.
Random walk gossip-based manycast with partition detection2008In: Supplemental Proceedings of the International Conference on Dependable Systems and Networks, DSN, 2008, IEEE Computer Society , 2008, p. G40-G41Conference paper (Other academic)

• 193.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology.
Rapid selection and dissemination of urgent messages over delay-tolerant networks (DTNs)2015In: Advances in Delay-Tolerant Networks (DTNs): Architecture and Enhanced Performance / [ed] J. Rodrigues, Elsevier, 2015, p. 187-203Chapter in book (Refereed)

Today many new applications are emerging that take advantage of wireless communication in handheld and embedded devices. Some of these emerging applications, such as information sharing in vehicular systems, have strong requirements for timely message dissemination, even if the network is not always 100% connected. In this chapter we discuss message differentiation mechanisms that can be used in intermittently connected networks to improve delivery and latency properties when messages have a limited time to live in the network. We present a simulation-based study on a large-scale vehicular scenario comparing different prioritisation mechanisms for a partition-tolerant manycast protocol. We show that negative effects of overloads can be significantly reduced by using information within the message about how far it has spread and how much time is remaining.

• 194.
Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory. Linköping University, The Institute of Technology.
Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory. Linköping University, The Institute of Technology. Instituto Tecnolgico Informtica Universidad Politcnica de Valencia, Spain. Instituto Tecnolgico Informtica Universidad Politcnica de Valencia, Spain.
Measuring Availability in Optimistic Partition-Tolerant Systems with Data Constraints2007In: Dependable Systems and Networks, DSN 2007, IEEE Computer Society, 2007, p. 656-665Conference paper (Refereed)

Replicated systems that run over partitionable environments can exhibit increased availability if isolated partitions are allowed to optimistically continue their execution independently. This availability gain is traded against consistency, since several replicas of the same objects could be updated separately. Once partitioning terminates, divergences in the replicated state need to be reconciled. One way to reconcile the state is to let the application manually resolve inconsistencies. However, there are several situations where automatic reconciliation of the replicated state is meaningful. We have implemented replication and automatic reconciliation protocols that can be used as building blocks in a partition-tolerant middleware. The novelty of the protocols is the continuous service of the application even during the reconciliation process. A prototype system is experimentally evaluated to illustrate the increased availability despite network partitions.
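The core reconciliation problem described above — two partitions update replicas of the same objects independently, then must converge on one state — can be illustrated with a log-replay merge. This is a minimal sketch under assumed names (`Op`, `reconcile`), using a last-writer-wins rule; the paper's protocols go further by keeping the application serviced *during* reconciliation, which this toy merge does not model:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Op:
    """One update accepted by a replica while the network was partitioned."""
    ts: float        # logical or wall-clock timestamp
    replica: str     # replica that accepted the operation (tie-breaker)
    key: str
    value: int

def reconcile(log_a: list[Op], log_b: list[Op]) -> dict[str, int]:
    """Merge two divergent operation logs into one converged state.

    Operations are replayed in (timestamp, replica) order so both sides
    deterministically reach the same result; a check of the application's
    integrity constraints could be inserted before applying each op.
    """
    merged = sorted(set(log_a) | set(log_b), key=lambda op: (op.ts, op.replica))
    state: dict[str, int] = {}
    for op in merged:
        state[op.key] = op.value     # last-writer-wins per key
    return state

# Two partitions updated the same object independently...
side_a = [Op(1.0, "A", "x", 1), Op(3.0, "A", "x", 3)]
side_b = [Op(2.0, "B", "x", 2)]
# ...after the partition heals, both orders converge to the same state.
assert reconcile(side_a, side_b) == reconcile(side_b, side_a) == {"x": 3}
```

The determinism of the merge order is what makes reconciliation automatic: no replica needs to ask a human which update wins.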

• 195.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, RTSLAB - Real-Time Systems Laboratory. Swedish National Defence College.
Emerging Information Infrastructures: Cooperation in Disasters, 2009. In: Critical Information Infrastructure Security: Third International Workshop, CRITIS 2008, Rome, Italy, October 13-15, 2008. Revised Papers / [ed] Roberto Setola, Stefan Geretshuber, Springer Berlin/Heidelberg, 2009, p. 258-270. Conference paper (Refereed)

In this paper we describe how to include high-level semantic information, such as aesthetics and emotions, into Content Based Image Retrieval. We present a color-based emotion-related image descriptor that can be used for describing the emotional content of images. The color emotion metric used is derived from psychophysical experiments and based on three variables: activity, weight and heat. It was originally designed for single colors, but recent research has shown that the same emotion estimates can be applied in the retrieval of multi-colored images. Here we describe a new approach, based on the assumption that perceived color emotions in images are mainly affected by homogeneous regions, defined by the emotion metric, and transitions between regions. RGB coordinates are converted to emotion coordinates, and for each emotion channel, statistical measurements of gradient magnitudes within a stack of low-pass filtered images are used for finding interest points corresponding to homogeneous regions and transitions between regions. Emotion characteristics are derived for patches surrounding each interest point, and saved in a bag-of-emotions that, for instance, can be used for retrieving images based on emotional content.
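The pipeline in this abstract — map RGB to emotion coordinates, then locate homogeneous regions and transitions via gradient magnitudes over a stack of low-pass filtered images — can be sketched as follows. The 3x3 transform matrix below is a made-up placeholder, not the published activity/weight/heat model from the psychophysical experiments, and the cheap box blur stands in for whatever low-pass filter the authors used:

```python
import numpy as np

# Placeholder coefficients only -- NOT the published colour-emotion model.
RGB_TO_EMOTION = np.array([
    [ 0.5, -0.3,  0.2],   # "activity"
    [-0.2,  0.6, -0.1],   # "weight"
    [ 0.4,  0.1, -0.5],   # "heat"
])

def emotion_channels(rgb: np.ndarray) -> np.ndarray:
    """Map an HxWx3 RGB image to HxWx3 emotion coordinates."""
    return rgb.astype(float) @ RGB_TO_EMOTION.T

def box_blur(x: np.ndarray, r: int) -> np.ndarray:
    """Cheap low-pass filter: average over a (2r+1)^2 window."""
    pad = np.pad(x, r, mode="edge")
    out = np.zeros(x.shape, dtype=float)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += pad[dy:dy + x.shape[0], dx:dx + x.shape[1]]
    return out / (2 * r + 1) ** 2

def interest_strength(channel: np.ndarray, radii=(1, 2, 4)) -> np.ndarray:
    """Gradient magnitude summed over a stack of low-pass filtered images.

    Low values suggest homogeneous regions, high values transitions
    between regions -- both are candidate interest points in the paper.
    """
    total = np.zeros(channel.shape, dtype=float)
    for r in radii:
        gy, gx = np.gradient(box_blur(channel, r))
        total += np.hypot(gx, gy)
    return total

img = np.random.rand(32, 32, 3)          # stand-in for a real image
strength = interest_strength(emotion_channels(img)[..., 0])
```

From maps like `strength`, per-patch statistics around each interest point would then be collected into the "bag-of-emotions" descriptor.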

• 196.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory.
MDA - Foundations and Applications (MDAFA) 2004, 2004. In: MDA - Foundations and Applications MDAFA 2004, 2004, Linköping, Sweden: Linköpings universitet, 2004. Conference paper (Refereed)
• 197.
Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory. Linköping University, The Institute of Technology.
Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory. Linköping University, The Institute of Technology.
Integrating Graph Rewrite Tools with Standard Tools, 2003. In: International Conference on Graph Transformations in Industrial Applications (AGTIVE 03), Springer-Verlag, 2003. Conference paper (Refereed)
• 198.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory.
Proceedings of Software Composition (SC) -- Workshop at ETAPS 2004, 2004. In: Workshop at ETAPS 2004, 2004, Spain: Electronic Notes in Theoretical Computer Science (ENTCS), 2004. Conference paper (Refereed)
• 199.
Zuse Institute Berlin.
Zuse Institute Berlin. Zuse Institute Berlin. Oregon State University. Zuse Institute Berlin.
Automatic, Tensor-Guided Illustrative Vector Field Visualization, 2013. Conference paper (Refereed)
• 200.
Zuse Institute Berlin.
Zuse Institute Berlin. Zuse Institute Berlin. Zuse Institute Berlin.
Glyph- and Texture-Based Visualization of Segmented Tensor Fields, 2012. Conference paper (Refereed)