liu.se - Search for publications in DiVA
Results 301 - 350 of 3077
• 301.
Linköping University, Department of Computer and Information Science, Human-Centered systems. Linköping University, The Institute of Technology.
Aalto University, Finland.
The Event Processing ODP (2013). In: Proceedings of the 4th Workshop on Ontology and Semantic Web Patterns co-located with the 12th International Semantic Web Conference (ISWC 2013), CEUR-WS, 2013, Vol. 1188. Conference paper (Refereed)

In this abstract we present a model for representing heterogeneous event objects in RDF, building on pre-existing work and focusing on structural aspects, which have not been addressed before, such as composite event objects encapsulating other event objects. The model extends the SSN and Event-F ontologies, and is available for download in the ODP portal.

• 302.
Linköping University, Department of Computer and Information Science, Human-Centered systems. Linköping University, The Institute of Technology.
Jönköping University, Department of Computer and Electrical Engineering. ISTC-CNR.
Ontology Testing - Methodology and Tool (2012). Conference paper (Other academic)

Ontology engineering lacks methods for verifying that ontological requirements are actually fulfilled by an ontology. There is a need for practical and detailed methodologies and tools for carrying out testing procedures and storing data about a test case and its execution. In this paper we first describe a methodology for conducting ontology testing, as well as three examples of applying this methodology to testing specific types of requirements. Next, we describe a tool that practically supports the methodology. We conclude that there is a need to support users in this crucial part of ontology engineering, and that our proposed methodology is a step in this direction.

• 303.
Linköping University, Department of Computer and Information Science, Human-Centered systems. Linköping University, The Institute of Technology.
University of Sheffield, UK. University of Sheffield, UK. University of Sheffield, UK. University of Sheffield, UK.
Statistical Knowledge Patterns for Characterising Linked Data (2013). In: Proceedings of the 4th Workshop on Ontology and Semantic Web Patterns (WOP 2013) co-located with the 12th International Semantic Web Conference (ISWC 2013), CEUR-WS, 2013, Vol. 1188. Conference paper (Refereed)

Knowledge Patterns (KPs), and even more specifically Ontology Design Patterns (ODPs), are no longer only generated in a top-down fashion, rather patterns are being extracted in a bottom-up fashion from online ontologies and data sources, such as Linked Data. These KPs can assist in tasks such as making sense of datasets and formulating queries over data, including performing query expansion to manage the diversity of properties used in datasets. This paper presents an extraction method for generating what we call Statistical Knowledge Patterns (SKPs) from Linked Data. SKPs describe and characterise classes from any reference ontology, by presenting their most frequent properties and property characteristics, all based on analysis of the underlying data. SKPs are stored as small OWL ontologies but can be continuously updated in a completely automated fashion. In the paper we exemplify this method by applying it to the classes of the DBpedia ontology, and in particular we evaluate our method for extracting range axioms from data. Results show that by setting appropriate thresholds, SKPs can be generated that cover (i.e. allow us to query, using the properties of the SKP) over 94% of the triples about individuals of that class, while only needing to care about 27% of the total number of distinct properties that are used in the data.
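The frequency-based selection that the abstract describes can be sketched in a few lines of Python. This is only an illustration on toy data with an arbitrary threshold; the actual SKP pipeline operates on Linked Data sources such as DBpedia, typically via SPARQL.

```python
from collections import Counter

# Toy triples (subject, predicate, object); all subjects are assumed
# to be instances of a single class under analysis.
triples = [
    ("s1", "name", "A"), ("s1", "birthPlace", "X"),
    ("s2", "name", "B"), ("s2", "birthPlace", "Y"), ("s2", "nickname", "Bee"),
    ("s3", "name", "C"),
]

def property_frequencies(triples):
    """Count how often each property occurs across the class's triples."""
    return Counter(p for _, p, _ in triples)

def coverage(triples, kept_properties):
    """Fraction of triples whose property is in the kept set."""
    kept = sum(1 for _, p, _ in triples if p in kept_properties)
    return kept / len(triples)

freqs = property_frequencies(triples)
# Keep only properties at or above a frequency threshold, as an SKP would.
kept = {p for p, n in freqs.items() if n >= 2}
```

On this toy data, keeping only the two frequent properties still covers 5 of the 6 triples, mirroring the paper's observation that a small fraction of distinct properties can cover most of the data.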

• 304.
Linköping University, Department of Computer and Information Science. Linköping University, Faculty of Science & Engineering.
DQ - Digitalt biljettsystem [DQ - Digital ticketing system] (2018). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis

The purpose of this report is to describe the development of a digital ticketing system intended for use by student associations at Linköping University. The system was developed by students at Linköping University on behalf of individuals representing LinTek and StuFF, two student unions at the university. The goal of the project was to develop a digital queueing system that students at Linköping University can use to buy tickets to parties and similar events. The result of the project is a system that works in many respects but lacks certain fundamental features. In addition to the developed system, this report was written, including one individual part per group member that explores in depth various areas related to the project.

• 305.
Linköping University, Department of Computer and Information Science, Software and Systems.
Designing and comparing access control systems (2016). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis

Access control systems are an important concept in the area of computer security. In this master thesis different solutions are analyzed. The focus is on a tool called DW Access, developed by Pdb Datasystem AB. A comparison showed that DW Access lacks some important functionality. After the comparison, a base model for an access control system was designed. The new design includes concepts such as relationships, replacements and time-limited access. It also works for generic subjects and objects in the system. This design was later partly implemented in DW Access.

The conclusions from this thesis work are that DW Access is a unique tool and that there is a market for the application or similar applications. The new functionality was a step forward, and the evaluation showed that the potential users liked the new concepts. However, the area remains very open because requirements on the market vary greatly between customers.

fulltext
• 306.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. University of Utah, USA.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. New York University, USA. University of Utah, USA. American Museum of Natural History, USA. American Museum of Natural History, USA. American Museum of Natural History, USA. New York University, USA. University of Utah, USA. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. Linköping University, Center for Medical Image Science and Visualization (CMIV). University of Utah, USA.
OpenSpace: A System for Astrographics (2020). In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506, Vol. 26, no 1, p. 633-642. Article in journal (Refereed)

Human knowledge about the cosmos is rapidly increasing as instruments and simulations generate new data supporting the formation of theory and understanding of the vastness and complexity of the universe. OpenSpace is a software system that takes on the mission of providing an integrated view of all these sources of data and supports interactive exploration of the known universe, from the millimeter scale showing instruments on spacecraft to billions of light years when visualizing the early universe. The ambition is to support research in astronomy and space exploration, science communication at museums and in planetariums, as well as bringing exploratory astrographics into the classroom. A multitude of challenges need to be met in reaching this goal, such as data variety, multiple spatio-temporal scales, and collaboration capabilities. Furthermore, the system has to be flexible and modular to enable rapid prototyping and inclusion of new research results or space mission data, and thereby shorten the time from discovery to dissemination. To support the different use cases, the system has to be hardware agnostic and support a range of platforms and interaction paradigms. In this paper we describe how OpenSpace meets these challenges in an open source effort that is paving the path for the next generation of interactive astrographics.

• 307.
NYU, USA.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. Amer Museum Nat Hist, NY 10024 USA. Community Coordinated Modeling Ctr, MD USA. Univ Utah, UT 84112 USA. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. Linköping University, Center for Medical Image Science and Visualization (CMIV). Univ Utah, UT 84112 USA.
OpenSpace: Changing the Narrative of Public Dissemination in Astronomical Visualization from What to How (2018). In: IEEE Computer Graphics and Applications, ISSN 0272-1716, E-ISSN 1558-1756, Vol. 38, no 3, p. 44-57. Article in journal (Refereed)

This article presents the development of open-source software called OpenSpace that bridges the gap between scientific discoveries and public dissemination, and thus paves the way for the next generation of science communication and data exploration. The article describes how the platform enables interactive presentations of dynamic and time-varying processes by domain experts to the general public. The concepts are demonstrated through four cases: image acquisitions of the New Horizons and Rosetta spacecraft, the dissemination of space weather phenomena, and the display of high-resolution planetary images. Each case has been presented at public events with great success. These cases highlight the details of data acquisition, rather than presenting only the final results, showing the audience the value of the efforts behind scientific discovery.

• 308.
NYU, USA.
Univ Utah, UT 84112 USA. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. Linköping University, Center for Medical Image Science and Visualization (CMIV).
OpenSpace: Bringing NASA Missions to the Public (2018). In: IEEE Computer Graphics and Applications, ISSN 0272-1716, E-ISSN 1558-1756, Vol. 38, no 5, p. 112-118. Article in journal (Refereed)

This viewpoint presents OpenSpace, an open-source astrovisualization software project designed to bridge the gap between scientific discoveries and their public dissemination. A wealth of data exists for space missions from NASA and other sources. OpenSpace brings together this data and combines it in a range of immersive settings. Through non-linear storytelling and guided exploration, interactive immersive experiences help the public to engage with advanced space mission data and models, and thus be better informed and educated about NASA missions, the solar system and outer space. We demonstrate this capability by exploring the OSIRIS-REx mission.

fulltext
• 309.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
Linköping University, Department of Computer and Information Science, Artificial Intelligence and Integrated Computer Systems. Linköping University, The Institute of Technology. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
Supporting Urban Search & Rescue Mission Planning through Visualization-Based Analysis (2014). In: Proceedings of the Vision, Modeling, and Visualization Conference 2014, Eurographics - European Association for Computer Graphics, 2014. Conference paper (Refereed)

We propose a visualization system for incident commanders in urban search & rescue scenarios that supports access path planning for post-disaster structures. Utilizing point cloud data acquired from unmanned robots, we provide methods for assessment of automatically generated paths. As data uncertainty and a priori unknown information make fully automated systems impractical, we present a set of viable access paths, based on varying risk factors, in a 3D environment combined with the visual analysis tools enabling informed decisions and trade-offs. Based on these decisions, a responder is guided along the path by the incident commander, who can interactively annotate and reevaluate the acquired point cloud to react to the dynamics of the situation. We describe design considerations for our system, technical realizations, and discuss the results of an expert evaluation.

fulltext
• 310.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
St. Barbara Hospital, Hamm, Germany. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology. St. Barbara Hospital, Hamm, Germany. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
Guiding Deep Brain Stimulation Interventions by Fusing Multimodal Uncertainty Regions (2013). Conference paper (Other academic)

Deep Brain Stimulation (DBS) is a surgical intervention that is known to reduce or eliminate the symptoms of common movement disorders, such as Parkinson's disease, dystonia, or tremor. During the intervention the surgeon places electrodes inside of the patient's brain to stimulate specific regions. Since these regions span only a couple of millimeters, and electrode misplacement has severe consequences, reliable and accurate navigation is of great importance. Usually the surgeon relies on fused CT and MRI data sets, as well as direct feedback from the patient. More recently Microelectrode Recordings (MER), which support navigation by measuring the electric field of the patient's brain, are also used. We propose a visualization system that fuses the different modalities: imaging data, MER and patient checks, as well as the related uncertainties, in an intuitive way to present placement-related information in a consistent view with the goal of supporting the surgeon in the final placement of the stimulating electrode. We will describe the design considerations for our system, the technical realization, present the outcome of the proposed system, and provide an evaluation.

paper
• 311.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
St. Barbara Hospital, Hamm, Germany. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. St. Barbara Hospital, Hamm, Germany. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
Supporting Deep Brain Stimulation Interventions by Fusing Microelectrode Recordings with Imaging Data (2012). Conference paper (Refereed)
fulltext
• 312.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
Linköping University, Department of Science and Technology. Linköping University, Faculty of Science & Engineering. American Museum of Natural History, New York, USA. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. American Museum of Natural History, New York, USA. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
OpenSpace: Public Dissemination of Space Mission Profiles (2015). In: 2015 IEEE Scientific Visualization Conference (SciVis): Proceedings / [ed] James Ahrens; Huamin Qu; Jos Roerdink, Institute of Electrical and Electronics Engineers (IEEE), 2015, p. 141-142. Conference paper (Refereed)

This work presents a visualization system and its application to space missions. The system allows the public to explore the scientific findings of spacecraft and gain a greater understanding thereof. Instruments' fields of view and their measurements are embedded in an accurate three-dimensional rendering of the solar system to provide context to past measurements or the planning of future events. We tested our system with NASA's New Horizons at the Pluto Pallooza event in New York and will expose it to the greater public on the upcoming July 14th Pluto flyby.

fulltext
• 313.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
NASA Goddard Space Flight Center, Greenbelt, MD, USA. NASA Goddard Space Flight Center, Greenbelt, MD, USA. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. Linköping University, Center for Medical Image Science and Visualization (CMIV). Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
VCMass: A Framework for Verification of Coronal Mass Ejection Ensemble Simulations (2014). Conference paper (Refereed)

Supporting the growing field of space weather forecasting, we propose a framework to analyze ensemble simulations of coronal mass ejections. As the current simulation technique requires manual input, uncertainty is introduced into the simulation pipeline, which leads to inaccurate predictions. Using our system, the analyst can compare ensemble members against ground truth data (arrival time and geo-effectivity) as well as information derived from satellite imagery. The simulations can be compared on a global basis, based on time-resolved quality measures, and as a 3D volumetric rendering with embedded satellite imagery in a multi-view setup. This flexible framework provides the expert with the tools to increase knowledge about the as yet not fully understood principles behind the formation of coronal mass ejections.

fulltext
• 314.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
NASA Goddard Space Flight Center, USA. NASA Goddard Space Flight Center, USA. NASA Goddard Space Flight Center, USA. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. Linköping University, Center for Medical Image Science and Visualization (CMIV). Ulm University, Germany.
Visual Verification of Space Weather Ensemble Simulations (2015). In: 2015 IEEE Scientific Visualization Conference (SciVis), IEEE, 2015, p. 17-24. Conference paper (Refereed)

We propose a system to analyze and contextualize simulations of coronal mass ejections. As current simulation techniques require manual input, uncertainty is introduced into the simulation pipeline, leading to inaccurate predictions that can be mitigated through ensemble simulations. We provide the space weather analyst with a multi-view system offering visualizations to: 1. compare ensemble members against ground truth measurements, 2. inspect time-dependent information derived from optical flow analysis of satellite images, and 3. combine satellite images with a volumetric rendering of the simulations. This three-tier workflow provides experts with tools to discover correlations between errors in predictions and simulation parameters, thus increasing knowledge about the evolution and propagation of coronal mass ejections that pose a danger to Earth and interplanetary travel.

• 315.
Linköping University, Department of Science and Technology. Linköping University, Faculty of Science & Engineering.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. iRobot, CA USA. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. Ulm University, Germany.
A Visualization-Based Analysis System for Urban Search & Rescue Mission Planning Support (2017). In: Computer Graphics Forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 36, no 6, p. 148-159. Article in journal (Refereed)

We propose a visualization system for incident commanders (ICs) in urban search and rescue scenarios that supports path planning in post-disaster structures. Utilizing point cloud data acquired from unmanned robots, we provide methods for the assessment of automatically generated paths. As data uncertainty and a priori unknown information make fully automated systems impractical, we present the IC with a set of viable access paths, based on varying risk factors, in a 3D environment combined with visual analysis tools enabling informed decision making and trade-offs. Based on these decisions, a responder is guided along the path by the IC, who can interactively annotate and reevaluate the acquired point cloud and generated paths to react to the dynamics of the situation. We describe visualization design considerations for our system and decision support systems in general, technical realizations of the visualization components, and discuss the results of two qualitative expert evaluations: one online study with nine search and rescue experts, and an eye-tracking study in which four experts used the system on an application case.

fulltext
• 316.
Linköping University, Department of Computer and Information Science, Human-Centered systems. Linköping University, The Institute of Technology.
What is the relationship between task-based and open-ended usability testing, in terms of measuring satisfaction? (2014). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis

Usability is one of the most important aspects of information technology: organizations strive to ensure the utmost satisfaction of their end-users with the experience of using their products, whether a website or a software application. To measure user satisfaction, usability testing can be performed. Usability testing gives a clear picture of the difficulties that would be faced by potential target users. There are different types of usability testing, such as task-based usability testing, open-ended usability testing and remote usability testing. The important point is to decide upon the most appropriate testing technique to obtain an accurate measure of user satisfaction.

This study focuses on answering the following research question: what is the relationship between task-based and open-ended usability testing, in terms of measuring satisfaction? The System Usability Scale (SUS) was used to measure user satisfaction. Two websites were evaluated, using task-based usability testing and open-ended usability testing respectively.

The study involved twenty-eight participants, divided into two groups: one performing open-ended usability testing and the other task-based usability testing, for both websites. Open-ended testing tended to produce higher SUS ratings for the tested systems: users performing open-ended testing gave positive responses for both websites in terms of satisfaction. Open-ended usability testing is exploratory, covering aspects such as the system's user interface and design, whereas task-based usability testing is goal-based, requiring users to complete given tasks. Task-based testing drew lower SUS scores than open-ended testing for the tested systems, but it is more straightforward for measuring efficiency and effectiveness. These results are discussed in detail, and the study concludes that, to measure the usability of a system, it is advisable to practice both open-ended and task-based usability testing.
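As background to the SUS figures discussed above, the standard SUS computation can be sketched in a few lines of Python. This is a generic illustration of the published SUS scoring formula, not code from the thesis.

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (score - 1);
    even-numbered items (negatively worded) contribute (5 - score).
    The summed contributions are multiplied by 2.5, yielding 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5
```

For example, a participant answering 5 on every positive item and 1 on every negative item scores the maximum of 100, while all-neutral answers (3 throughout) score 50.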

• 317.
Linköping University, Department of Computer and Information Science.
Verifikation av verktyget aspect analyzer [Verification of the aspect analyzer tool] (2003). Independent thesis, Basic level (professional degree). Student thesis

Rising complexity in the development of real-time systems has made it crucial to have reusable components and a more flexible way of configuring these components into a coherent system. Aspect-oriented system development (AOSD) is a technique that allows one to put a system's crosscutting concerns into "modules" called aspects. By applying AOSD in real-time and embedded system development, one can expect reductions in the complexity of the system design and development.

A problem with AOSD in its current form is that it does not support predictability in the time domain. Hence, in order to use AOSD in real-time system development, we need to provide ways of analyzing temporal behavior of aspects, components and resulting system (made from weaving aspects and components). Aspect analyzer is a tool that computes the worst-case execution time (WCET) for a set of components and aspects, thus, enabling support for predictability in the time domain of aspect-oriented real-time software.

A limitation of the aspect analyzer, until now, was that no verification had been performed of whether it produces WCET values close to the measured WCET, or to the WCET computed with another analysis technique, for an aspect-oriented real-time system. Therefore, in this thesis we verify the correctness of the aspect analyzer using a number of different methods for WCET analysis. These investigations of the correctness of the aspect analyzer's output gave confidence in the automated WCET analysis. In addition, performing this verification led to the identification of the steps necessary to compute the WCET of a piece of a program when using a third-party tool, which makes it possible to write accurate input files for the aspect analyzer.
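As a rough illustration of the kind of composition such an analyzer performs, a conservative WCET bound for a woven component can be sketched as below. This is a simplification under stated assumptions (every advice runs at most once per invocation); the real tool must account for weaving points and control flow, and all names and figures here are hypothetical.

```python
def woven_wcet(component_wcet, advice_wcets):
    """Conservative WCET bound for a component after aspect weaving:
    the component's own worst case plus the worst case of every woven
    advice, assuming each advice executes at most once per invocation.
    """
    return component_wcet + sum(advice_wcets)

# Hypothetical cycle counts: one component with two woven aspects.
bound = woven_wcet(1200, [150, 75])
```

Verifying the analyzer then amounts to comparing such computed bounds against measured or independently computed WCETs, as the thesis describes.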

fulltext
• 318. Bodirsky, Manuel
Linköping University, Department of Computer and Information Science, Software and Systems.
The Complexity of Phylogeny Constraint Satisfaction (2016). Conference paper (Refereed)
• 319.
Ecole Polytechnique, Palaiseau, France.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology. Max-Planck-Institute for Human Development, Berlin, Germany and University of Virginia, Charlottesville, USA..
Essential Convexity and Complexity of Semi-algebraic Constraints (2012). In: Logical Methods in Computer Science, ISSN 1860-5974, E-ISSN 1860-5974, Vol. 8, no 4. Article in journal (Refereed)

Let Γ be a structure with a finite relational signature and a first-order definition in (ℝ; *, +) with parameters from ℝ, that is, a relational structure over the real numbers where all relations are semi-algebraic sets. In this article, we study the computational complexity of the constraint satisfaction problem (CSP) for Γ: the problem of deciding whether a given primitive positive sentence is true in Γ. We focus on those structures Γ that contain the relations ≤, {(x,y,z) | x+y=z} and {1}. Hence, all CSPs studied in this article are at least as expressive as the feasibility problem for linear programs. The central concept in our investigation is essential convexity: a relation S is essentially convex if for all a, b ∈ S, there are only finitely many points on the line segment between a and b that are not in S. If Γ contains a relation S that is not essentially convex and this is witnessed by rational points a, b, then we show that the CSP for Γ is NP-hard. Furthermore, we characterize essentially convex relations in logical terms. This different view may open up new ways for identifying tractable classes of semi-algebraic CSPs. For instance, we show that if Γ is a first-order expansion of (ℝ; *, +), then the CSP for Γ can be solved in polynomial time if and only if all relations in Γ are essentially convex (unless P = NP).
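The central notion of the abstract, essential convexity, can be written compactly in LaTeX (a restatement of the definition given in the abstract, nothing more):

```latex
% A relation S over the reals is essentially convex iff the line
% segment between any two of its points leaves S only finitely often.
\[
S \subseteq \mathbb{R}^n \text{ is essentially convex}
\iff
\forall a, b \in S:\;
\bigl|\{\, t \in [0,1] \mid a + t(b - a) \notin S \,\}\bigr| < \infty .
\]
```

Ordinary convexity is the special case where this exceptional set is empty for all a, b.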

fulltext
• 320.
Ecole Polytechnique, Palaiseau, France.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology. University of Virginia, USA.
Horn versus Full First-order: Complexity Dichotomies in Algebraic Constraint Satisfaction (2012). In: Journal of Logic and Computation (Print), ISSN 0955-792X, E-ISSN 1465-363X, Vol. 22, no 3, p. 643-660. Article in journal (Refereed)

We study techniques for deciding the computational complexity of infinite-domain constraint satisfaction problems. For certain fundamental algebraic structures Delta, we prove definability dichotomy theorems of the following form: for every first-order expansion Gamma of Delta, either Gamma has a quantifier-free Horn definition in Delta, or there is an element d of Gamma such that all non-empty relations in Gamma contain a tuple of the form (d,...,d), or all relations with a first-order definition in Delta have a primitive positive definition in Gamma. The results imply that several families of constraint satisfaction problems exhibit a complexity dichotomy: the problems are in P or NP-hard, depending on the choice of the allowed relations. As concrete examples, we investigate fundamental algebraic constraint satisfaction problems. The first class consists of all first-order expansions of (Q;+). The second class is the affine variant of the first class. In both cases, we obtain full dichotomies by utilising our general methods.

• 321.
CNRS/LIX, École Polytechnique, 91128 Palaiseau, France.
Linköping University, Department of Computer and Information Science, TCSLAB - Theoretical Computer Science Laboratory. Linköping University, The Institute of Technology. Max-Planck-Institute for Human Development, Königin-Luise-Strasse 5, 14195, Berlin.
Semilinear Program Feasibility2009In: Automata, Languages and Programming, Berlin / Heidelberg: Springer , 2009, p. 79-90Conference paper (Refereed)

We study logical techniques for deciding the computational complexity of infinite-domain constraint satisfaction problems (CSPs). For the fundamental algebraic structure Γ whose domain is the real numbers and whose relations L1, L2, ... enumerate all linear relations with rational coefficients, we prove that a semilinear relation R (i.e., a relation that is first-order definable with linear inequalities) either has a quantifier-free Horn definition in Γ or the CSP for the expansion of Γ by R is NP-hard. The result implies a complexity dichotomy for all constraint languages that are first-order expansions of Γ: the corresponding CSPs are either in P or NP-complete, depending on the choice of allowed relations. We apply this result to two concrete examples (generalised linear programming and metric temporal reasoning) and obtain full complexity dichotomies in both cases.

• 322.
Ecole Polytechnique, Palaiseau, France.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology.
Equivalence Constraints2012Conference paper (Other academic)

The following result for finite structures Gamma has been conjectured to hold for all countably infinite omega-categorical structures Gamma: either the model-complete core Delta of Gamma has an expansion by finitely many constants such that the pseudovariety generated by its polymorphism algebra contains a two-element algebra all of whose operations are projections, or there is a homomorphism f from Delta^k to Delta, for some finite k, and an automorphism alpha of Delta satisfying f(x1,...,xk) = alpha(f(x2,...,xk,x1)). This conjecture has been confirmed for all infinite structures Gamma that have a first-order definition over (Q;<), and for all structures that are definable over the random graph. In this paper, we verify the conjecture for all structures that are definable over an equivalence relation with a countably infinite number of countably infinite classes. Our result implies a complexity dichotomy (into NP-complete and P) for a family of constraint satisfaction problems (CSPs) which we call equivalence constraint satisfaction problems. The classification for equivalence CSPs can also be seen as a first step towards a classification of the CSPs for all relational structures that are first-order definable over Allen's interval algebra, a well-known constraint calculus in temporal reasoning.

• 323.
Linköping University, Department of Computer and Information Science.
Handledning i casual-spel: Konsten att lära ut ett komplext spel på ett intuitivt sätt2017Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis

Games for educational purposes are becoming an increasingly common way to teach children new concepts and activate their minds. To capture children's attention and get them immersed in a game, the game must be introduced well. So how does one create an interesting and instructive tutorial for children for a relatively complex puzzle board game? That is what I have investigated in this thesis, by conducting a literature study and then using that knowledge to create a tutorial for the game Entangled. Entangled is a puzzle board game that aims to teach the basics of quantum entanglement. For a game to be good, all its parts must work together at the same level: for example, its rules, game mechanics, and graphics. All parts must function together for the final game to be a good overall experience.

In addition, the game's rules and mechanics must be explained, which is most commonly done through an in-game tutorial that introduces how the game works.

Entangled is written in JavaScript using the framework Phaser.io, which simplifies the handling of graphics, animations, and other objects. Entangled was originally developed, under a different name, by other students at Linköping University. During my thesis work, Entangled was modified to improve the game experience and immersion by replacing the graphics and introducing a tutorial. To evaluate whether the changes work well, a study was conducted in the form of observed play sessions. Ten participants aged 8-25 took part; each played Entangled and then answered two questionnaires, the IEQ and PANAS.

When creating a tutorial whose goal is to get new players to invest time and form a connection to the game, it is important that the tutorial does not give too much help; this forces the new player to think more and become more involved in the game. Giving too little information, however, has the opposite effect, and the balance between too much and too little can be hard to strike, to say the least. In Entangled, the participants felt that the tutorial was not sufficiently explanatory, but they did not give up and kept trying until they succeeded. This may be one reason why the results of the study were comparatively positive: the participants became more involved.

• 324.
Linköping University, Department of Computer and Information Science.
Visualisering av elektroniska kopplingsscheman2009Independent thesis Basic level (university diploma), 10 credits / 15 HE creditsStudent thesis

AnSyn AB is a company in Linköping that develops software for optimizing analog electronics. Their program Analog Dimensions contains a visualization module that draws the circuit schematics the electronics designer works with. AnSyn was not satisfied with their existing solution: the old visualization module had several limitations, and this report follows the work of developing a new one. The work resulted in a completely new visualization module that lacks the limitations of the old one. The module is written entirely in Java and uses a graphics library called NetBeans Visual Library, an open-source library that can be used for, among other things, visualization of vector graphics. The report also contains a survey of Java libraries that handle vector graphics, in which a total of 15 different libraries were studied.

• 325.
Linköping University, Department of Computer and Information Science, Software and Systems.
Converting an existing .NET Framework ground control software into a cross-platform application2018Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis

Unmanned aerial vehicles can be used in many different situations, such as monitoring the growth of crops or surveillance of private property. Operating the unmanned aerial vehicle is usually done using some kind of ground control station. This thesis examines the possibilities of creating ground control stations that work on several different platforms using the cross-platform development frameworks Xamarin, Universal Windows Platform (UWP) and Mono. This is done by creating and comparing three prototype applications with regard to functional requirements, code reuse and resource usage. It is shown that none of the cross-platform frameworks can fulfil all of the initial requirements on a ground control station. However, for the case studied in this thesis, Xamarin is demonstrated to be the most suitable cross-platform framework of the three, since it provides the same functionality as UWP for Windows devices while also enabling development for both Android and iOS devices.

• 326. Bonatti, Piero
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, IISLAB - Laboratory for Intelligent Information Systems. Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, IISLAB - Laboratory for Intelligent Information Systems.
The REWERSE View on Policies2005In: Semantic Web and Policy Workshop,2005, Proceedings of the Semantic Web and Policy Workshop: UMBC eBiquity , 2005, p. 21-Conference paper (Refereed)
• 327.
Naples University.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, Database and information techniques. University of Zurich. L3S Research Center. St. Gallen University. Linköping University, Department of Computer and Information Science, Database and information techniques. Linköping University, The Institute of Technology.
Semantic Web Policies -- A Discussion of Requirements and Research Issues2006In: European Semantic Web Conference,2006, Springer: Springer , 2006, p. 712-724Conference paper (Refereed)

Policies are pervasive in web applications. They play crucial roles in enhancing security, privacy and usability of distributed services. There has been extensive research in the area, including the Semantic Web community, but several aspects still exist that prevent policy frameworks from widespread adoption and real world application. This paper discusses important requirements and open research issues in this context, focusing on policies in general and their integration into trust management frameworks, as well as on approaches to increase system cooperation, usability and user-awareness of policy issues.

• 328. Bonatti, Piero
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, IISLAB - Laboratory for Intelligent Information Systems. Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, IISLAB - Laboratory for Intelligent Information Systems.
An Integration of Reputation-based and Policy-based Trust Management2005In: Semantic Web and Policy Workshop,2005, Proceedings of the Semantic Web and Policy Workshop: UMBC eBiquity , 2005, p. 136-Conference paper (Refereed)
• 329.
Università di Napoli.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, IISLAB - Laboratory for Intelligent Information Systems. Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, IISLAB - Laboratory for Intelligent Information Systems. Hannover University. Hannover University. Università degli Studi di Torino. Università degli Studi di Torino. Università degli Studi di Torino. Università degli Studi di Torino. Università di Napoli. Institute of Computer Science, FORTH, Greece. University of St. Gallen, Switzerland. University of Zurich, Switzerland.
Rule-based Policy Specification: State of the Art and Future Work2004Report (Other academic)
• 330.
Linköping University, Department of Computer and Information Science. Linköping University, The Institute of Technology.
Utveckling av kodeditor för kreativt skapande på webben2014Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis

Creating by coding is a new and exciting way to produce art, learn to code, or simply have fun. The purpose of this report is to explore how such an experience can be made more interactive and intuitive for the user, and how such functionality can be implemented. The basis for this is an explorative case study of the work the project group carried out to improve the website for CodeArt.

It turns out that an improved user experience can be achieved by implementing various tools and features: in this case live coding, version control, help texts, and a logging tool.

LIU-IDA/LITH-EX-G--14-038--SE, Datavetenskap vid LiTH 15 hp examensarbete, grundnivå
• 331.
Linköping University, Department of Computer and Information Science.
Deployment and analysis of DKIM with DNSSEC2008Independent thesis Advanced level (professional degree), 20 credits / 30 HE creditsStudent thesis

As the email system is widely used as a communication channel, and often is crucial for the performance of organizations, it is important that users can trust the content of what is being delivered to them. A standard called DomainKeys Identified Mail (DKIM) has been developed by the IETF to solve the problems of authentication and integrity by using digital signatures. The goal of this master's thesis is to evaluate a solution in which an implementation of DKIM is extended with DNSSEC validation. DNSSEC is a solution which secures, among other things, the mapping between IP addresses and domain names. The implementation of DKIM is deployed and evaluated with function testing, domain testing, threat analysis, and interoperability testing. DKIM does not need any new public-key infrastructure, thus incurring less deployment cost than other cryptographic solutions such as S/MIME and PGP. We recommend using DKIM together with DNSSEC to secure the transport of the DKIM public key. The upcoming standard ADSP can inform the recipient of whether a domain signs its email or not, thereby making it possible to detect unauthorized signature removal. A further problem is that mailing lists often manipulate the email, thus breaking the signature. We therefore recommend sending email directly to the recipient or activating DKIM signing on the mailing lists.

• 332.
Linköping University, Department of Computer and Information Science. Linköping University, The Institute of Technology.
Horn clause logic with external procedures: towards a theoretical framework1989Licentiate thesis, monograph (Other academic)

Horn clause logic has certain properties which limit its usefulness as a programming language. In this thesis we concentrate on three such limitations: (1) Horn clause logic is not intended for the implementation of algorithms. Thus, if a problem has an efficient algorithmic solution, it may be difficult to express this within the Horn clause formalism. (2) To work with a predefined structure like integer arithmetic, one has to axiomatize it by a Horn clause program; the functions of the structure must then be represented as predicates of the program. (3) Instead of re-implementing existing software modules, it is clearly better to re-use them. To this end, support for combining Horn clause logic with other programming languages is needed.

When extending the Horn clause formalism, there is always a trade-off between the general applicability and the purity of the resulting system. There have been many suggestions for solving some of the problems (1) to (3). Most of them use one of the following strategies: (a) to allow new operational features, such as access to low-level constructs of other languages; (b) to introduce new language constructs, and to support them by a clean declarative semantics and a complete operational semantics.

In this thesis, a solution to problems (1) to (3) is suggested. It combines strategies (a) and (b) by limiting their generality: we allow Horn clause programs to call procedures written in arbitrary languages. It is assumed, however, that these procedures are either functional or relational. The functional procedures yield a ground term as output whenever given ground terms as input; similarly, the relational procedures either succeed or fail whenever applied to ground terms. Under these assumptions, the resulting language has a clean declarative semantics.

For the operational semantics, an extended but incomplete unification algorithm, called S-unify, is developed. Using properties of this algorithm, we characterize classes of goals for which our interpreter is complete. It is also formally proved that (a slightly extended version of) S-unify is as complete as possible under the adopted assumptions.
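The groundness assumption on external procedures can be made concrete with a small sketch. The following is purely illustrative (it is not the thesis's S-unify): an external functional procedure is applied only when all its arguments are ground, and otherwise the call is delayed, matching the assumption stated above that functional procedures map ground terms to ground terms.

```python
# Illustrative sketch, not the thesis code: calling an external functional
# procedure from a logic-programming setting only when its arguments are
# ground. Variables are represented by a Var class; compound terms by
# tuples (functor, arg1, ...); everything else is a ground constant.

class Var:
    def __init__(self, name):
        self.name = name

def is_ground(term):
    """A term is ground if it contains no variables."""
    if isinstance(term, Var):
        return False
    if isinstance(term, tuple):            # compound term: (functor, args...)
        return all(is_ground(t) for t in term[1:])
    return True

def call_external(proc, args):
    """Apply an external functional procedure, or delay if non-ground."""
    if all(is_ground(a) for a in args):
        return proc(*args)                 # ground input -> ground output
    return "delayed"                       # residuate until bindings arrive

plus = lambda x, y: x + y                  # an external procedure in the host language
```

Under this discipline, a call such as `call_external(plus, [1, 2])` reduces to a ground term, while `call_external(plus, [Var("X"), 2])` is postponed rather than unified speculatively.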

• 333.
Linköping University, Department of Computer and Information Science, TCSLAB - Theoretical Computer Science Laboratory. Linköping University, The Institute of Technology.
Linköping University, Department of Computer and Information Science, TCSLAB - Theoretical Computer Science Laboratory. Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science. Linköping University, The Institute of Technology.
A Simple Fixed Point Characterization of Three-Valued Stable Model Semantics1990In: Information Processing Letters, ISSN 0020-0190, E-ISSN 1872-6119, Vol. 40, no 2, p. 73-78Article in journal (Refereed)
• 334.
Linköping University, Department of Computer and Information Science.
Kalman Filter with Adaptive Noise Models for Statistical Post-Processing of Weather Forecasts2017Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis

We develop a Kalman filter with adaptive noise models for statistical post-processing of 2-metre temperature forecasts, with the purpose of reducing the systematic errors from which numerical weather prediction models usually suffer. To this end, we propose time-varying dynamic linear models for the system noise covariance matrix and the measurement noise covariance matrix, and we study how this affects the mean predictions of the underlying state and the observed data. Five Kalman filter models are introduced: a discrete Kalman filter model with the distinctive feature that the measurement (observation) at time t is the observed forecast error at that time; two Kalman filters with adaptive noise models, where the measurement noise covariance matrix is time-varying; a Kalman filter model where the forecasts of the 10-metre wind components are included as explanatory variables; and a Kalman filter with heavy-tailed noise using the Student's t-distribution under a Bayesian approach. Ten weather stations located in Sweden are selected with the aim of obtaining a heterogeneous sample, and six different issued forecasts are filtered with different sets of initial values.

The implementation of these methods has been done in Python and R.
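The first of the five models, where the measurement is the observed forecast error, admits a compact sketch. The following scalar version is purely illustrative and is not the thesis code: the noise variances q and r are held constant here, whereas the thesis makes them time-varying (adaptive).

```python
# Illustrative sketch: a scalar Kalman filter tracking the systematic
# 2-metre temperature forecast error. The "measurement" z at each step
# is the observed forecast error, as in the first model described above.

def kalman_bias_filter(errors, q=0.01, r=1.0):
    """Filter a sequence of observed forecast errors; return bias estimates."""
    x, p = 0.0, 1.0              # state estimate (bias) and its variance
    estimates = []
    for z in errors:
        p += q                   # predict: bias persists, uncertainty grows
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update with the observed forecast error
        p *= (1.0 - k)           # posterior variance
        estimates.append(x)
    return estimates

# A forecast that is consistently 2 degrees too cold:
est = kalman_bias_filter([2.0] * 50)
```

The estimated bias converges toward the true systematic error, which can then be subtracted from future raw forecasts; the adaptive models in the thesis additionally let q and r evolve over time.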

• 335.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology.
SMT Aided Test Case Generation For Constrained Feature Models2014Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis

With the development of large, highly configurable software systems, a new challenge arises in software testing. While traditional testing approaches may still apply and succeed in achieving a better quality of service, the high degree of customization in such systems implies repeating the testing activities on different configurations. If a formal notation is used to express the allowed configurations of a system, such configurations can be generated in an automated fashion. However, if constraints are involved, traditional model-based test-case generation may fail to achieve the desired coherency. The idea is to use those constraints to generate test cases and to achieve coherency at the same time. Satisfiability modulo theories (SMT) is an emerging field in theoretical computer science that has developed decision procedures treating various theory fragments in a specific manner. The goal of this thesis is to examine a translation mechanism from an expression language for constraints into SAT modulo theories and to integrate this technique into a test-case generation process. Furthermore, the balance between generating coherent test cases and the problem-specific purposes of such test cases is investigated.
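The core task described above, generating only configurations that satisfy the feature model's constraints, can be sketched on a toy example. The thesis delegates this to an SMT solver; here, a stdlib-only brute-force enumeration plays the same role for a hypothetical feature model small enough to enumerate (the feature names and constraints below are invented for illustration).

```python
# Toy illustration, not the thesis code: enumerating coherent test
# configurations of a constrained feature model by brute force.

from itertools import product

FEATURES = ["gui", "network", "ssl", "logging"]

def satisfies(cfg):
    # Constraints of the (hypothetical) feature model:
    # "ssl" requires "network", and "gui" excludes "logging".
    return (not cfg["ssl"] or cfg["network"]) and not (cfg["gui"] and cfg["logging"])

def coherent_configurations():
    """Yield every assignment of features that respects the constraints."""
    for values in product([False, True], repeat=len(FEATURES)):
        cfg = dict(zip(FEATURES, values))
        if satisfies(cfg):
            yield cfg

configs = list(coherent_configurations())
```

An SMT solver replaces the exponential enumeration with a decision procedure, which is what makes the approach viable for realistic feature models; the coherency property, that every generated configuration satisfies all constraints, is the same in both settings.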

• 336.
Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory. Linköping University, The Institute of Technology.
Contributions to management and validation of non-functional requirements2004Licentiate thesis, monograph (Other academic)

Non-functional requirements (NFRs) are essential when considering software quality, in that they should represent the right quality of the intended software. It is generally hard to capture NFRs and to specify them in measurable terms, and most software development methods applied today focus on functional requirements (FRs). Moreover, NFRs are relatively unexplored in the literature, and knowledge regarding the real-world treatment of NFRs is particularly rare.

A case study and a literature survey were performed to provide this kind of knowledge, which also served as a problem inventory to outline future research activities. An interview series with practitioners at two large software development organizations was carried out. As a major result, it was established that too few NFRs are considered in development and that they are stated in vague terms. Moreover, it was observed that organizational power structures strongly influence the quality of the forthcoming software, and that processes need to be well suited for dealing with NFRs.

Among several options, it was selected to explore how processes can be made better suited to handling NFRs by adding information about actual feature use. A case study was performed in which the feature use of an interactive product-management tool was measured indirectly from the log files of an industrial user, and the approach was also applied to the problem of requirements selection. The results showed that the idea is feasible and that quality aspects can be effectively addressed by considering actual feature use.

An agenda for continued research comprises: further studies in system usage data acquisition, modelling of NFRs, and comparing means for predicting feasibility of NFRs. One strong candidate is weaving high-level requirement models with models of available components.

• 337.
Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory. Linköping University, The Institute of Technology.
Processes and Models for Capacity Requirements in Telecommunication Systems2009Doctoral thesis, comprehensive summary (Other academic)

Capacity is an essential quality factor in telecommunication systems. The ability to develop systems with the lowest cost per subscriber and transaction, that also meet the highest availability requirements and at the same time allow for scalability, is a true challenge for a telecommunication systems provider. This thesis describes a research collaboration between Linköping University and Ericsson AB aimed at improving the management, representation, and implementation of capacity requirements in large-scale software engineering.

An industrial case study on non-functional requirements in general was conducted to provide the explorative research background, and a richer understanding of the identified difficulties was gained by dedicating subsequent investigations to capacity. A best-practice inventory within Ericsson regarding the management of capacity requirements and their refinement into design and implementation was carried out. It revealed that capacity requirements crosscut most of the development process and the system lifecycle, thus widening the research context considerably. The interview series resulted in the specification of 19 capacity sub-processes; these were represented as a method plug-in to the OpenUP software development process in order to construct a coherent package of knowledge as well as to communicate the results. They also provide the basis of an empirically grounded anatomy, which has been validated in a focus group. The anatomy enables the assessment and stepwise improvement of an organization's ability to develop for capacity, thus keeping the initial cost low. Moreover, the notion of capacity is discussed, and a pragmatic approach is presented for supporting model-based, function-oriented development with capacity information by annotating it in UML models. The results combine into a method for improving the treatment of capacity requirements in large-scale software systems.

1. The Bad Conscience of Requirements Engineering: An Investigation in Real-World Treatment of Non-Functional Requirements
2003 (English)In: Third Conference on Software Engineering Research and Practice in Sweden (SERPS'03), Lund, 2003, p. 1-8Conference paper, Published paper (Refereed)
##### Abstract [en]

Even though non-functional requirements (NFRs) are critical in order to provide software of good quality, the literature on NFRs is relatively sparse. We describe how NFRs are treated in two development organizations, an Ericsson application center and the IT department of the Swedish Meteorological and Hydrological Institute. We have interviewed professionals about the problems they face and their ideas on how to improve the situation. Both organizations are aware of NFRs and related problems, but their main focus is on functional requirements, primarily because existing methods focus on these. The most tangible problems experienced are that many NFRs remain undiscovered and that NFRs are stated in non-measurable terms. It became clear that the size and structure of the organization require proper distribution of employees' interest, authority and competence regarding NFRs. We argue that a feasible solution might be to strengthen the position of architectural requirements, which are more likely to emphasize NFRs.

##### Keywords
Non-functional requirements, case study
##### National Category
Software Engineering
##### Identifiers
urn:nbn:se:liu:diva-16790 (URN)
Available from: 2009-02-25 Created: 2009-02-19 Last updated: 2018-01-13Bibliographically approved
2. Good Practice and Improvement Model of Handling Capacity Requirements of Large Telecommunication Systems
2006 (English)In: 14th IEEE International Requirements Engineering Conference (RE'06), Minneapolis/S:t Paul, Los Alamitos, CA: IEEE Computer Society , 2006, p. 245-250Conference paper, Published paper (Refereed)
##### Abstract [en]

There is evidence to suggest that the software industry has not yet matured with regard to the management of non-functional requirements (NFRs). Consequently, the cost of achieving the required quality is unnecessarily high. To avoid this, the telecommunication systems provider Ericsson defined a research task to improve the management of requirements for capacity, which is one of the most critical NFRs. Linköping University joined the effort and conducted an interview series to investigate good practice within different parts of the company. Inspired by the interviews and an ongoing process improvement project, a model for improvement was created and activities were synthesized. This paper contributes the results from the interview series and details the sub-processes of specification that should be improved. Such improvements concern understanding the relationship between numerical entities at all system levels, augmenting UML specifications to make NFRs visible, working with time budgets, and testing the subsystem-level components on the same level as they are specified.

##### Place, publisher, year, edition, pages
Los Alamitos, CA: IEEE Computer Society, 2006
##### Keywords
Non-functional requirements, capacity, process improvement
##### National Category
Software Engineering
##### Identifiers
urn:nbn:se:liu:diva-16791 (URN)10.1109/RE.2006.28 (DOI)0-7695-2555-5 (ISBN)978-0-7695-2555-6 (ISBN)
Available from: 2009-02-19 Created: 2009-02-19 Last updated: 2018-01-13Bibliographically approved
3. Integrating an Improvement Model of Handling Capacity Requirements with OpenUP/Basic Process
2007 (English)In: 13th International working conference on Requirements Engineering: Foundations for Software Quality (REFSQ'07), Trondheim, Norway, Berlin Heidelberg: Springer , 2007, p. 341-354Conference paper, Published paper (Refereed)
##### Abstract [en]

Contemporary software processes and modeling languages have a strong focus on Functional Requirements (FRs), whereas information about Non-Functional Requirements (NFRs) is managed with text-based documentation and the individual skills of the personnel. In order to get a better understanding of how capacity requirements are handled, we carried out an interview series with various branches of Ericsson. The analysis of this material revealed 18 Capacity Sub-Processes (CSPs) that need to be attended to in order to create capacity-oriented development. In this paper we describe all these sub-processes and their mapping into an extension of the OpenUP/Basic software process. Such an extension supports a process engineer in realizing the sub-processes, and has at the same time shown that there are no internal inconsistencies among the CSPs. The extension provides a context for continued research on using UML to support negotiation between requirements and existing design.

##### Place, publisher, year, edition, pages
Berlin Heidelberg: Springer, 2007
##### Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 4542
##### Keywords
Capacity requirements, OpenUP/Basic, method plug-in, Eclipse Process Framework, process improvement
##### National Category
Software Engineering
##### Identifiers
urn:nbn:se:liu:diva-16792 (URN)10.1007/978-3-540-73031-6_26 (DOI)978-3-540-73030-9 (ISBN)
Available from: 2009-02-19 Created: 2009-02-19 Last updated: 2018-01-13Bibliographically approved
4. Extending the OpenUP/Basic Requirements Discipline to Specify Capacity Requirements
2007 (English)In: Requirements Engineering Conference, 2007. RE '07, IEEE Computer Society, 2007, p. 328-333Conference paper, Published paper (Refereed)
##### Abstract [en]

Software processes, such as RUP and agile methods, focus their requirements engineering on use cases and thus on functional requirements. Complex products, such as radio network control software, need special handling of non-functional requirements as well. We describe how we used the Eclipse Process Framework to augment the open and minimal OpenUP/Basic process with improvements found in the management of capacity requirements in a case study at Ericsson. The result is compared with another project improving RUP to handle performance requirements. The major differences between the improvements are that 1) they suggest a special, dedicated performance-manager role whereas we suggest augmenting present roles, and 2) they suggest a bottom-up approach to performance verification while we focus on system performance first, i.e. top-down. Further, we suggest augmenting UML 2 models with capacity attributes to improve the information flow from requirements to implementation.

##### Place, publisher, year, edition, pages
IEEE Computer Society, 2007
##### Series
International Requirements Engineering Conference. Proceedings, ISSN 1090-705X
##### Keywords
Capacity requirements, process improvement, method plug-in, OpenUP/Basic, Eclipse Process Framework
##### National Category
Software Engineering
##### Identifiers
urn:nbn:se:liu:diva-16797 (URN)10.1109/RE.2007.24 (DOI)000251576800040 ()978-0-7695-2935-6 (ISBN)
##### Conference
15th IEEE International Requirements Engineering Conference, 15-19 October 2007, Delhi, India
Available from: 2009-02-19 Created: 2009-02-19 Last updated: 2018-01-13Bibliographically approved
5. A Case Study in Assessing and Improving Capacity Using an Anatomy of Good Practice
2007 (English)In: The 6th joint meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering (ESEC/FSE 2007), Dubrovnik, Croatia, New York: ACM , 2007, p. 509-512Conference paper, Published paper (Refereed)
##### Abstract [en]

Capacity in telecommunication systems is highly related to operator revenue. As a vendor of such systems, Ericsson AB is continuously improving its processes for estimating, specifying, tuning, and testing the capacity of delivered systems. In order to systematize process improvements Ericsson AB and Linköping University joined forces to create an anatomy of Capacity Sub Processes (CSPs). The anatomy is the result of an interview series conducted to document good practices amongst organizations active in capacity improvement. In this paper we analyze four different development processes in terms of how far they have reached in their process maturity according to our anatomy and show possible improvement directions. Three of the processes are currently in use at Ericsson, and the fourth is the OpenUP/Basic process which we have used as a reference process in earlier research. We also include an analysis of the observed good practices. The result mainly confirms the order of CSPs in the anatomy, but we need to use our information of the maturity of products and the major life cycle in the organization in order to fully explain the role of the anatomy in planning of improvements.

##### Place, publisher, year, edition, pages
New York: ACM, 2007
##### Keywords
Capacity, non-functional requirements, process improvement
##### National Category
Software Engineering
##### Identifiers
urn:nbn:se:liu:diva-16801 (URN), 10.1145/1287624.1287697 (DOI), 978-1-59593-811-4 (ISBN)
Available from: 2009-02-19 Created: 2009-02-19 Last updated: 2018-01-13. Bibliographically approved
6. A Method for Improving the Treatment of Capacity Requirements in Large Telecommunication Systems
##### Abstract [en]

Non-functional requirements crosscut functional models and are therefore more difficult to enforce in system models. This paper describes a long-term research collaboration on capacity requirements between Linköping University and Ericsson AB. As background, we describe an industrial case study on non-functional requirements. Succeeding efforts dedicated to capacity include a detailed description of the term, a best-practice inventory within Ericsson, and a pragmatic approach for annotating UML models with capacity information. The results are also represented as a method plug-in to the OpenUP software process and as an anatomy that makes it possible to assess and improve an organization's abilities to develop for capacity. The results combine into a method for improving the treatment of capacity requirements in large-scale software systems. Both product and process views are included, with emphasis on the latter.

##### Keywords
Non-functional requirements, capacity requirements, process improvement, anatomy, UML, OpenUP, Eclipse Process Framework
##### National Category
Software Engineering
##### Identifiers
urn:nbn:se:liu:diva-16805 (URN)
Available from: 2009-02-19 Created: 2009-02-19 Last updated: 2018-01-13. Bibliographically approved
Processes and Models for Capacity Requirements in Telecommunication Systems
• 338.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory.
Measuring the Use of Features in a Requirements Engineering Tool - An Industrial Case Study2004In: Fourth Conference on Software Engineering Research and Practice in Sweden, 2004, 2004, p. 101-Conference paper (Refereed)
• 339.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory.
Supporting Requirements Selection by Measuring Feature Use2004In: Tenth International Workshop on Requirements Engineering: Foundation for Software Quality REFSQ04, 2004, 2004Conference paper (Refereed)
• 340.
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory.
Ericsson AB. Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, PELAB - Programming Environment Laboratory.
Modelling Capacity Requirements in Large-Scale Telecommunication Systems2008In: Eighth Conference on Software Engineering Research and Practice in Sweden SERPS08, 2008, 2008Conference paper (Refereed)

• 341.
Linköping University, Department of Computer and Information Science, Software and Systems.
Usability of a Business Software Solution for Financial Follow-up Information of Service Contracts2018Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis

Enterprise Resource Planning systems have been available since the 1990s and come with several business benefits for their users. One of the major advantages is improved decision making through current and accessible information about the strategic, tactical and operational levels of the organization. Although several Enterprise Resource Planning system vendors provide features for contract management, customers want more decision support regarding the total profitability of service contracts. Estimating the total profitability of service contracts is a challenging task for all service providers and implies a lot of manual data processing by the contract manager. This master's thesis was conducted in collaboration with IFS World Operations AB and aims to investigate how functionality for budgeting and forecasting the profitability of service contracts can be designed to be usable in terms of effectiveness. The implementation was performed iteratively, and the resulting prototypes were evaluated and refined throughout the project. The final high-fidelity prototype for budgeting of service contracts was evaluated using the task success rate in conjunction with the System Usability Scale to assess how well the system conformed to the needs of the users. The study revealed that two of the key characteristics of financial follow-up information for service contracts are support for creating a budget and graphical visualization of both budgeted and actual values. The final usability evaluation indicated that the developed functionality was usable in terms of effectiveness and had an overall usability score clearly above average.
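For reference, the System Usability Scale used in the evaluation above aggregates ten 1-5 Likert responses into a 0-100 score: odd-numbered items contribute (response - 1), even-numbered items (5 - response), and the sum is multiplied by 2.5. A minimal sketch of this standard scoring rule (not code from the thesis):

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten Likert
    responses (1-5). Odd-numbered items are positively worded,
    even-numbered items negatively worded."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:           # items 1, 3, 5, 7, 9: positive wording
            total += r - 1
        else:                    # items 2, 4, 6, 8, 10: negative wording
            total += 5 - r
    return total * 2.5           # scale the 0-40 raw sum to 0-100

# Best possible answers on every item:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A score of 68 is often cited as the benchmark average, which gives meaning to "clearly above average" conclusions like the one above.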

• 342.
Technische Universität Darmstadt, Germany.
Linköping University, Department of Computer and Information Science, Database and information techniques. Linköping University, The Institute of Technology. Technische Universität Darmstadt, Germany.
Analysis of Privacy-Enhancing Protocols Based on Anonymity Networks2012Conference paper (Refereed)

In this paper, we analyze privacy-enhancing protocols for Smart Grids that are based on anonymity networks. The underlying idea behind such protocols is attributing two distinct partial identities to each consumer. One is used to send real-time information about the power consumption, and the other for transmitting the billing information. Such protocols provide sender-anonymity for the real-time information, while consolidated data is sent for billing. In this work, the privacy properties of such protocols are analyzed, and their computational efficiency is evaluated using simulation and compared to other solutions based on homomorphic encryption.

• 343.
NICTA, Australia; University of New South Wales, Sydney, NSW, Australia.
NICTA, Alexandria, NSW, Australia . Linköping University, Department of Computer and Information Science, Database and information techniques. Linköping University, The Institute of Technology. University of Saskatchewan, Canada. NICTA, Alexandria, NSW, Australia .
The Untold Story of the Clones: Content-agnostic Factors that Impact YouTube Video Popularity2012In: Proc. ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD) 2012, Association for Computing Machinery (ACM), 2012, p. 1186-1194Conference paper (Refereed)

Video dissemination through sites such as YouTube can have widespread impacts on opinions, thoughts, and cultures. Not all videos will reach the same popularity and have the same impact. Popularity differences arise not only because of differences in video content, but also because of other "content-agnostic" factors. The latter factors are of considerable interest but it has been difficult to accurately study them. For example, videos uploaded by users with large social networks may tend to be more popular because they tend to have more interesting content, not because social network size has a substantial direct impact on popularity.

In this paper, we develop and apply a methodology that is able to accurately assess, both qualitatively and quantitatively, the impacts of various content-agnostic factors on video popularity. When controlling for video content, we observe a strong linear "rich-get-richer" behavior, with the total number of previous views as the most important factor except for very young videos. The second most important factor is found to be video age. We analyze a number of phenomena that may contribute to rich-get-richer, including the first-mover advantage, and search bias towards popular videos. For young videos we find that factors other than the total number of previous views, such as uploader characteristics and number of keywords, become relatively more important. Our findings also confirm that inaccurate conclusions can be reached when not controlling for content.
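The linear "rich-get-richer" behavior reported above means that, holding content fixed, per-step view growth is roughly proportional to the current view count, so early differences persist multiplicatively. A toy illustration with an invented growth rate (not the paper's fitted model):

```python
# Toy simulation of linear "rich-get-richer" growth: each video's
# views grow proportionally to its current views (preferential
# attachment), so an early lead compounds over time.
def simulate_views(initial_views, rate=0.05, steps=30):
    views = list(initial_views)
    for _ in range(steps):
        # new views per step are proportional to current views
        views = [v + rate * v for v in views]
    return views

# Two identical-content clones; one starts with a 20% head start.
final = simulate_views([100, 120])
print(final[1] / final[0])  # the relative head start is preserved: ~1.2
```

With identical content (as in the paper's controlled clone sets), the relative gap never closes under purely proportional growth, which is exactly why content-agnostic factors are hard to study without controlling for content.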

• 344.
Linköping University, Department of Computer and Information Science, Human-Centered systems. Linköping University, Faculty of Science & Engineering.
Utveckling av ett kommunikationsprotokoll för datainsamling från medicinteknisk utrustning2015Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis

Patient data management systems (PDMS) have to handle various data sources such as patient monitors, ventilators and pumps. The objective of this work was to develop a general and maintainable medical protocol and a database system to manage patient data from various medical instruments in intensive care.

A prototype system with an experimental communication protocol and a database was created and evaluated technically with a Philips MP30 patient monitor system. The system was developed in the programming language C# with a MySQL database. The study shows that it is possible to create a protocol that handles various medical data sources. The report describes the developed protocol, data structures, architecture and related communication standards used in healthcare.
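The thesis's actual protocol is not reproduced here; purely as an illustration of the general pattern such a system follows (length-prefixed frames carrying measurements from instruments), a sketch with invented field names:

```python
import json
import struct

def encode_measurement(device_id, parameter, value, unit):
    """Encode one measurement as a length-prefixed JSON frame.
    Field names are hypothetical, not the thesis's protocol."""
    payload = json.dumps({
        "device": device_id,     # e.g. a patient-monitor identifier
        "param": parameter,      # e.g. "HR" for heart rate
        "value": value,
        "unit": unit,
    }).encode("utf-8")
    # 4-byte big-endian length header followed by the payload
    return struct.pack(">I", len(payload)) + payload

def decode_measurement(frame):
    (length,) = struct.unpack(">I", frame[:4])
    return json.loads(frame[4:4 + length].decode("utf-8"))

frame = encode_measurement("monitor-1", "HR", 72, "bpm")
print(decode_measurement(frame)["value"])  # 72
```

The length prefix lets a receiver split a byte stream from any instrument into complete messages, which is the core requirement when multiplexing heterogeneous data sources into one database.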

• 345.
Linköping University, Department of Computer and Information Science, Human-Centered systems.
Implementation och användbarhet av en spelbutik i en applikation2016Independent thesis Basic level (university diploma), 10,5 credits / 16 HE creditsStudent thesis

This thesis project is in the field of computer engineering and was carried out at Linköping University. It revolves around the mathematical puzzle game Entanglement, which is based on Christer Fuglesang's children's books about space. The game is still under development. The idea is that Entanglement will be a free application, hence the need to implement some kind of revenue-generating feature. During the project, a case study and a requirements elicitation were performed, in which existing revenue features in other applications were reviewed and requirements were derived. This resulted in the virtual store that Entanglement now uses. The game is based on the design of the books, and the store therefore also uses material taken from them. Since the game is written using the Phaser framework, the store was also built with Phaser. The result of the project is a working virtual store with a design from Christer Fuglesang's books. User tests were also carried out to assess whether the store performs according to the customer's wishes and requirements, and whether the game achieves usability.

• 346.
Linköping University, Department of Computer and Information Science.
Penetrationstester: Offensiv säkerhetstestning2006Independent thesis Advanced level (degree of Magister), 20 points / 30 hpStudent thesis

Penetration tests are a way of testing computer security from an attacker's perspective. This report presents a number of common vulnerability types to look for and suggests methods for testing them. It was created to support organizations that want to start their own penetration-testing activities. In addition to vulnerability types, the report therefore also covers the most important types of tools and competencies needed, along with suggested administrative routines and a rough outline of how tests are carried out. The final chapter discusses how the report can serve as a basis for further work, and how penetration testing uses and relates to a few neighboring areas such as cryptography and intrusion detection. It concludes with a short analysis of when penetration tests are useful: either when they are fully automated and only check a few vulnerabilities, or when the cost of an intrusion would be very high.

• 347.
Computer Science, Aalborg University, Denmark.
Computer Science, Aalborg University, Denmark. Computer Science, Aalborg University, Denmark. Computer Science, Aalborg University, Denmark. Computer Science, Aalborg University, Denmark. Computer Science, Aalborg University, Denmark. Computer Science, Aalborg University, Denmark.
Statistical and exact schedulability analysis of hierarchical scheduling systems2016In: Science of Computer Programming, ISSN 0167-6423, E-ISSN 1872-7964, Vol. 127, p. 103-130Article in journal (Refereed)

This paper contains two contributions: 1) a development methodology involving two techniques to enhance resource utilization, and 2) a new generic multi-core resource model for hierarchical scheduling systems.

As the first contribution, we propose a two-stage development methodology relying on the adjustment of timing attributes in the detailed models during the design stage. We use a lightweight method (statistical model checking) for design exploration, easily assuring high confidence in the correctness of the models. Once a satisfactory design has been found, it can be proved schedulable using the computationally costly method (symbolic model checking). In order to analyze a hierarchical scheduling system compositionally, we introduce the notion of a stochastic supplier modeling the supply of resources from each component to its child components in the hierarchy. We specifically investigate two different techniques to widen the set of provably schedulable systems: 1) a new supplier model; 2) restricting the potential task offsets.

We also provide a way to estimate the minimum resource supply (budget) that a component is required to provide. In contrast to analytical methods, we prove non-schedulable cases via concrete counterexamples. By having richer and more detailed scheduling models, this framework has the potential to prove the schedulability of more systems.

As the second contribution, we introduce a generic resource model for multi-core hierarchical scheduling systems, and show how it can be instantiated for classical resource models: Periodic Resource Models (PRM) and Explicit Deadline Periodic (EDP) resource models. The generic multi-core resource model is presented in the context of a compositional model-based approach for schedulability analysis of hierarchical scheduling systems.

The multi-core framework presented in this paper is an extension of the single-core framework used for the analysis in the rest of the paper.
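As background on the Periodic Resource Model mentioned above: PRM(Π, Θ) guarantees Θ units of resource every period Π, and a standard linear lower bound on the supply over any interval of length t is lsbf(t) = (Θ/Π)(t − 2(Π − Θ)), clipped at zero. A sketch of this classical single-core bound (not the paper's generic multi-core model):

```python
def lsbf(period, budget, t):
    """Linear supply bound function of a periodic resource model
    PRM(period, budget): a guaranteed lower bound on the resource
    supplied in any interval of length t."""
    slope = budget / period
    return max(0.0, slope * (t - 2 * (period - budget)))

# A component granted 2 units every 5 may be starved for as long as
# the worst-case gap 2 * (5 - 2) = 6 ...
print(lsbf(5, 2, 6))   # 0.0
# ... after which supply accrues at the long-run rate 2/5.
print(lsbf(5, 2, 16))  # 4.0
```

A child component is then schedulable under the parent's supply if its resource demand in every interval stays below this bound, which is the compositional check the framework generalizes to multi-core resources.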

• 348.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
Schedulability and Memory Interference Analysis of Multicore Preemptive Real-time Systems2017In: ICPE '17 Proceedings of the 8th ACM/SPEC on International Conference on Performance Engineering, ACM Press, 2017, p. 263-274Conference paper (Refereed)

Today's embedded systems demand increasing computing power to accommodate the ever-growing software functionality. Automotive and avionic systems aim to leverage the high performance capabilities of multicore platforms, but are faced with challenges with respect to temporal predictability. Multicore designers have achieved much progress on improvement of memory-dependent performance in caching systems and shared memories in general. However, having applications running simultaneously and requesting access to the shared memories concurrently leads to interference. The performance unpredictability resulting from interference at any shared memory level may lead to violation of the timing properties in safety-critical real-time systems. In this paper, we introduce a formal analysis framework for the schedulability and memory interference of multicore systems with shared caches and DRAM. We build a multicore system model with a fine-grained application behavior given in terms of periodic preemptible tasks, described with explicit read and write access numbers for shared caches and DRAM. We also provide a method to analyze and recommend candidates for task-to-core reallocation with the goal to find schedulable configurations if a given system is not schedulable. Our model-based framework is realized using Uppaal and has been used to analyze a case study.
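For contrast with the Uppaal-based framework described above, the classical single-core fixed-priority response-time analysis, which ignores memory interference entirely, iterates R = C_i + Σ_{j&lt;i} ⌈R/T_j⌉·C_j to a fixed point:

```python
import math

def response_time(tasks, i):
    """Classical response-time analysis for fixed-priority preemptive
    periodic tasks. tasks: list of (C, T) pairs sorted by priority,
    highest first. Returns the worst-case response time of task i, or
    None if the iteration exceeds the period (deadline miss). This is
    the textbook single-core analysis, not the paper's
    memory-interference-aware framework."""
    C, T = tasks[i]
    r = C
    while True:
        # preemption from all higher-priority tasks within window r
        interference = sum(math.ceil(r / Tj) * Cj for Cj, Tj in tasks[:i])
        r_next = C + interference
        if r_next > T:
            return None        # deadline (= period) missed
        if r_next == r:
            return r           # fixed point reached
        r = r_next

tasks = [(1, 4), (2, 6), (3, 13)]   # (WCET, period), rate-monotonic order
print([response_time(tasks, i) for i in range(3)])  # [1, 3, 10]
```

Frameworks like the paper's exist precisely because adding shared-cache and DRAM interference terms to this recurrence is pessimistic or intractable analytically, whereas model checking explores the interleavings directly.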

• 349.
Linköping University, Department of Computer and Information Science. Linköping University, The Institute of Technology.
Instance-based ontology alignment using decision trees2012Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis

Using ontologies is a key technology in the semantic web. The semantic web helps people store their data on the web, build vocabularies, and write rules for handling those data; it also helps search engines distinguish the information users want to access more easily. In order to use multiple ontologies created by different experts, we need matchers that find the similar concepts in them, so that the ontologies can be merged.

Text-based searches use string similarity functions to find equivalent concepts inside ontologies based on their names. This is the method used in lexical matchers. But a global standard for naming concepts in different research areas does not exist or has not been used: the same name may refer to different concepts, while different names may describe the same concept.

To address this problem we can use another approach for calculating the similarity value between concepts, which is used in structural and constraint-based matchers. It uses relations between concepts, synonyms and other information stored in the ontologies. Another category of matchers is instance-based, which uses additional information, such as documents related to the concepts of the ontologies (the corpus), to calculate the similarity value for the concepts.

Decision trees are used in data mining for many kinds of classification. Using decision trees in an instance-based matcher is the main idea of this thesis. The results of the implemented matcher, which uses the C4.5 algorithm, are discussed, and the matcher is compared to other matchers. It is also combined with other matchers to obtain better results.
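As a simpler stand-in for the instance-based idea (the thesis itself trains a C4.5 decision tree on the documents rather than comparing them directly), the similarity between two concepts can be estimated from the overlap of the document sets associated with them:

```python
def instance_similarity(instances_a, instances_b):
    """Instance-based similarity between two concepts, computed as the
    Jaccard overlap of their associated document sets. This is a far
    simpler measure than the thesis's C4.5 classifier, but rests on the
    same intuition: concepts described by the same documents are
    likely equivalent."""
    a, b = set(instances_a), set(instances_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Concepts from two ontologies, each with documents classified to them
# (names and documents invented for illustration).
heart_disease = {"doc1", "doc2", "doc3", "doc4"}
cardiac_disorder = {"doc2", "doc3", "doc4", "doc5"}
print(instance_similarity(heart_disease, cardiac_disorder))  # 0.6
```

A classifier-based matcher improves on this by generalizing from document features, so two concepts can match even when their corpora share no documents at all.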

• 350.
Linköping University, Department of Computer and Information Science. Linköping University, The Institute of Technology.
Dependency-based groundness analysis of functional logic programs1993Licentiate thesis, monograph (Other academic)

The object of study in this thesis is a class of functional logic programs, where the functions are implemented in an external functional or imperative language. The contributions are twofold:

Firstly, an operational semantics is formally defined. The key idea is that non-ground function calls selected for unification are delayed and retained in form of constraints until their arguments become ground. With this strategy two problems arise: (1) Given a program P and an initial goal, will any delayed unifications remain unresolved after computation? (2) For every function call f(X) in P, find a safe evaluation point for f(X), i.e. a point in P where X always will be bound to a ground term, and thus f(X) can be evaluated.

Secondly, we present a static groundness analysis technique which enables us to solve problems (1) and (2) in a uniform way. The analysis method is dependency-based, exploiting analogies between logic programs and attribute grammars.
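The delay strategy described above, retaining non-ground external calls as constraints until their arguments become ground, can be sketched outside a logic language as well. All class and method names below are invented for illustration:

```python
class Var:
    """A logic variable: unbound until `bind` is called."""
    def __init__(self):
        self.value = None
        self.bound = False

    def bind(self, value, store):
        self.value = value
        self.bound = True
        store.wake()          # re-check delayed calls on every binding

class ConstraintStore:
    """Delays external function calls whose argument is non-ground,
    retrying them whenever a variable becomes bound -- a sketch of the
    delay/residuation strategy described above."""
    def __init__(self):
        self.delayed = []     # retained (function, argument) constraints
        self.results = []

    def call(self, fn, arg):
        if isinstance(arg, Var) and not arg.bound:
            self.delayed.append((fn, arg))   # retain as a constraint
        else:
            value = arg.value if isinstance(arg, Var) else arg
            self.results.append(fn(value))

    def wake(self):
        still_delayed = []
        for fn, arg in self.delayed:
            if arg.bound:
                self.results.append(fn(arg.value))
            else:
                still_delayed.append((fn, arg))
        self.delayed = still_delayed

store = ConstraintStore()
x = Var()
store.call(abs, x)            # x non-ground: the call is delayed
print(len(store.delayed))     # 1
x.bind(-7, store)             # x becomes ground: the call resolves
print(store.results)          # [7]
```

Problem (1) in the abstract corresponds to `store.delayed` being non-empty when computation ends; problem (2) corresponds to statically finding the program point at which the binding is guaranteed, so the call need never be delayed at run time.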
