Using Domain-specific Corpora for Improved Handling of Ambiguity in Requirements. Ezzini, Saad; Abualhaija, Sallam; et al. In Proceedings of the 43rd International Conference on Software Engineering (ICSE'21), Madrid, 25-28 May 2021 (2021, May).

Ambiguity in natural-language requirements is a pervasive issue that has been studied by the requirements engineering community for more than two decades. A fully manual approach for addressing ambiguity in requirements is tedious and time-consuming, and may further overlook unacknowledged ambiguity – the situation where different stakeholders perceive a requirement as unambiguous but, in reality, interpret the requirement differently. In this paper, we propose an automated approach that uses natural language processing for handling ambiguity in requirements. Our approach is based on the automatic generation of a domain-specific corpus from Wikipedia. Integrating domain knowledge, as we show in our evaluation, leads to a significant improvement in the accuracy of ambiguity detection and interpretation. We scope our work to coordination ambiguity (CA) and prepositional-phrase attachment ambiguity (PAA) because of the prevalence of these types of ambiguity in natural-language requirements [1]. We evaluate our approach on 20 industrial requirements documents. These documents collectively contain more than 5000 requirements from seven distinct application domains. Over this dataset, our approach detects CA and PAA with an average precision of 80% and an average recall of 89% (90% for cases of unacknowledged ambiguity). The automatic interpretations that our approach yields have an average accuracy of 85%.
Compared to baselines that use generic corpora, our approach, which uses domain-specific corpora, achieves 33% better accuracy in ambiguity detection and 16% better accuracy in interpretation.

Using environmental predictive settlement choice models as input data for settlement pattern simulations. Sikk, Kaarel. Scientific Conference (2020).

Inductive models of archaeological site locations have been successfully used for predicting the archaeological potential of places in landscapes. These models are mostly based on currently observable environmental information. To reduce environmental determinism and increase both explanatory and predictive power, several variables, such as the visibility of locations, have been interpreted as social factors of settlement location. In the current paper we explore the possibilities of using inductive environmental models as input to simulation models. Although similar to models created for predictive purposes, they need to be designed with different considerations. We present a study in which we use inductive models of archaeological site locations to describe the spatial configuration of the space environmentally suitable for residence. To do so, we develop a conceptual agent-based model of residential choice based on discrete choice theory and theories of residential choice used in multiple fields, from archaeology to contemporary urban studies. We discuss the role of environmental influences as perceived in archaeological data and how they relate to social influences and the historical processes leading to the emergence of settlement patterns.
We argue that the spatial structures of the inductive models of specific settlement patterns can inform us about the causal processes behind them when experimented with in agent-based simulations. We present a case study using inductive models of settlement locations from different periods of the Stone Age of Estonia. The differences between inductive settlement choice models and the ways of comparing them are discussed. The spatial configurations of the models of different economic modes have different structures. For example, the region where settlements of water-connected hunter-gatherers are found has a different spatial structure than that of early agrarian communities. These differences give insights into socioeconomic histories and can be used to explain settlement pattern formation processes.

Using Epidemic Hoarding to Minimize Load Delays in Distributed Virtual Environments. Botev, Jean; Esch, Markus; et al. In Proceedings of the 4th International Conference on Collaborative Computing: Networking, Applications and Worksharing (CollaborateCom) (2008).

Using FileMaker Pro to get the most from your corpus data. Deroey, Katrien. Scientific Conference (2014, June 21).

This presentation provides a basic introduction to the database programme FileMaker Pro. I will use examples from my research, for which I used Corpus Query Language in Sketch Engine to retrieve importance markers from BASE lectures, which I then stored and annotated with FileMaker Pro.
Although this programme is mainly used by businesses, and is therefore probably less familiar to corpus researchers than, for example, Access, it offers many features which greatly facilitate and speed up the processing of corpus data for research or materials development. Corpus concordances can be imported into a FileMaker database, where you can give them multiple tags and quickly and easily generate quantified instances from your corpus using any tag or combination of tags. For example, the programme allowed me to classify concordances of importance markers by lexicogrammatical pattern, interactive and textual orientation type, component parts (e.g. verbs, Subjects), discipline, study level, co-occurring discourse markers, etc. In this way, it took only a few seconds to generate and quantify instances of importance markers which, for instance, have the pattern ‘V clause’, contain ‘remember’ and co-occur with the discourse marker ‘but’. The programme thus allows you to examine and quantify the same data in a variety of ways and to retrieve only those instances you are interested in. This has considerable potential for facilitating the retrieval of corpus evidence for materials design and research.
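The tag-and-filter workflow this abstract describes — storing concordances with multiple tags and retrieving any combination of them — can be sketched in plain Python. The field names, tag values and sample concordances below are invented for illustration; they are not drawn from the BASE corpus or the author's actual database:

```python
# Minimal sketch of tag-based concordance filtering, in the spirit of the
# FileMaker workflow described above. Records and field names are invented.
concordances = [
    {"text": "but remember that this point is crucial",
     "pattern": "V clause", "verb": "remember", "discourse_marker": "but"},
    {"text": "it is important to note the following",
     "pattern": "it V-link ADJ", "verb": "note", "discourse_marker": None},
    {"text": "but remember what we said last week",
     "pattern": "V clause", "verb": "remember", "discourse_marker": "but"},
]

def query(records, **tags):
    """Return all records whose fields match every given tag."""
    return [r for r in records
            if all(r.get(field) == value for field, value in tags.items())]

# Combine any tags, as in the 'V clause' + 'remember' + 'but' example above.
hits = query(concordances, pattern="V clause", verb="remember",
             discourse_marker="but")
print(len(hits))  # → 2
```

Quantifying by a different tag combination is just another `query(...)` call, which mirrors how a stored FileMaker find can be re-run over the same imported concordances.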
Using forum theatre in organised youth soccer to positively influence antisocial and prosocial behaviour: a pilot study. Biesta, Gert; et al. In Journal of Moral Education (2010), 39(1), 65-78.

Using Game Theory to configure P2P SIP. Becker, Sheila; State, Radu; Engel, Thomas. In Lecture Notes in Computer Science (2009).

Using Gamification and Metaphor to Design a Mobility Platform for Commuters. McCall, Roderick; Koenig, Vincent; Kracheel, Martin; et al. In International Journal of Mobile Human Computer Interaction (2012).

In this paper the authors explain the use of gamification as a way to optimize mobility patterns within a heavily congested European city. They explore this from two perspectives: first by outlining a gaming concept, and secondly by explaining how the use of a mobility game that took place in two locations can be used to explore incentives and design issues.

Using global team science to identify genetic Parkinson's disease worldwide. Krüger, Rejko; et al. In Annals of Neurology (2019).

Large multicenter approaches are necessary to systematically and uniformly characterize patients with genetic neurologic conditions and to eventually establish sizable clinical trial-ready cohorts.
Using GPS and absolute gravity observations to separate the effects of present-day and Pleistocene ice-mass changes in South East Greenland. van Dam, Tonie; Francis, Olivier; et al. In Earth and Planetary Science Letters (2017), 459.

Measurements of vertical crustal uplift from bedrock sites around the edge of the Greenland ice sheet (GrIS) can be used to constrain present-day mass loss. Interpreting any observed crustal displacement around the GrIS in terms of present-day changes in ice is complicated, however, by the glacial isostatic adjustment (GIA) signal. With GPS observations alone, it is impossible to separate the uplift driven by present-day mass changes from that due to ice-mass changes in the past. Wahr et al. (1995) demonstrated that viscoelastic surface displacements are related to viscoelastic gravity changes through a proportionality constant that is nearly independent of the choice of Earth viscosity or ice history model. Thus, by making measurements of both gravity and surface motion at a bedrock site, the viscoelastic effects can be removed from the observations, allowing present-day ice-mass changes to be constrained. Alternatively, the same observations of surface displacements and gravity can be used to determine the GIA signal. In this paper, we extend the theory of Wahr et al. (1995) by introducing a constant, Z, that represents the ratio between the elastic change in gravity and the elastic uplift at a particular site due to present-day mass changes.
Further, we combine 20 years of GPS observations of uplift with eight absolute gravity observations over the same period to determine the GIA signal near Kulusuk, a site on the southeastern side of the GrIS, to experimentally demonstrate the theory. We estimate that the GIA signal in the region is 4.49 ± 1.44 mm/yr, which is inconsistent with most previously reported model predictions, which indicate that the GIA signal here is negative. However, as there are very little in situ data to constrain the GIA rate in this part of Greenland, the Earth model or the ice history reconstructions could be inaccurate (Khan et al., 2016). Improving the estimate of GIA in this region of Greenland will allow us to better determine the present-day changes in ice mass in the region, e.g. from GRACE.

Using GPS and Gravity to Infer Ice Mass Changes in Greenland. van Dam, Tonie; et al. In EOS (2000), 81(37), 421-427.

Using graph theory to analyze biological networks. et al. In BioData Mining (2011), 4(10), 1-27.

Understanding complex systems often requires a bottom-up analysis towards a systems biology approach. The need emerges to investigate a system not only as individual components but as a whole. This can be done by examining the elementary constituents individually and then how these are connected. The myriad components of a system and their interactions are best characterized as networks, and they are mainly represented as graphs where thousands of nodes are connected by thousands of edges. In this article we demonstrate approaches, models and methods from the graph theory universe and we discuss ways in which they can be used to reveal hidden properties and features of a network.
This network profiling, combined with knowledge extraction, will help us to better understand the biological significance of the system.

Using Heterogeneous Multilevel Swarms of UAVs and High-Level Data Fusion to Support Situation Management in Surveillance Scenarios. Bouvry, Pascal; Danoy, Grégoire; et al. In 2016 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI 2016 (2016, September 19).

The development and usage of Unmanned Aerial Vehicles (UAVs) has increased quickly in recent decades, mainly for military purposes. This technology is now also of high interest in non-military contexts like logistics, environmental studies and different areas of civil protection. While the technology for operating a single UAV is rather mature, additional efforts are still necessary for using UAVs in fleets (or swarms). The Aid to SItuation Management based on MUltimodal, MUltiUAVs, MUltilevel acquisition Techniques (ASIMUT) project, which is supported by the European Defence Agency (EDA), aims at investigating and demonstrating dedicated surveillance services based on fleets of UAVs. The aim is to enhance the situation awareness of an operator and to decrease their workload by providing support for the detection of threats based on multi-sensor, multi-source data fusion. The operator is also supported by the combination of information delivered by the heterogeneous swarms of UAVs and by additional information extracted from intelligence databases. As a result, a distributed surveillance system with increased detection capability, high-level data fusion capabilities and UAV autonomy is proposed.
Using Hidden Markov Models and Rule-based Sensor Mediation on Wearable eHealth Devices. Neyens, Gilles; Zampunieris, Denis. In Proceedings of the 11th International Conference on Mobile Ubiquitous Computing, Systems, Services and Technologies, Barcelona, Spain, 12-16 November 2017 (2017).

Improvements in sensor miniaturization allow wearable devices to provide more functionality while also being more comfortable for users to wear. The Samsung Simband©, for example, has six different sensors – electrocardiogram (ECG), photoplethysmogram (PPG), galvanic skin response (GSR), bio-impedance (Bio-Z), accelerometer and thermometer – as well as a modular sensor hub to easily add additional ones. This increased number of sensors for wearable devices opens new possibilities for more precise monitoring of patients by integrating the data from the different sensors. This integration can be affected by failing or malfunctioning sensors and by noise. In this paper, we propose an approach that uses Hidden Markov Models (HMMs) in combination with a rule-based engine to mediate among the different sensors' data in order to allow the eHealth system to compute a diagnosis on the basis of the selected reliable sensors. We also show some preliminary results about the accuracy of the first stage of the proposed model.
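The general idea of scoring a sensor's recent readings against an HMM, with a rule-based layer then deciding which sensors to trust, can be sketched as follows. The two hidden states, all transition and emission probabilities, the discretised observation alphabet and the reliability threshold are illustrative assumptions, not values taken from the paper:

```python
# Hedged sketch: a two-state HMM ("ok" vs "faulty") scored with the forward
# algorithm over a discretised sensor stream ("normal"/"noisy" readings).
# All probabilities below are invented for illustration.
states = ["ok", "faulty"]
start = {"ok": 0.9, "faulty": 0.1}
trans = {"ok":     {"ok": 0.95, "faulty": 0.05},
         "faulty": {"ok": 0.10, "faulty": 0.90}}
emit = {"ok":     {"normal": 0.9, "noisy": 0.1},
        "faulty": {"normal": 0.2, "noisy": 0.8}}

def forward(observations):
    """Return P(observations) under the HMM via the forward algorithm."""
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: emit[s][obs] * sum(alpha[p] * trans[p][s] for p in states)
                 for s in states}
    return sum(alpha.values())

# A rule-based layer might drop a sensor whose recent stream is too unlikely
# before the diagnosis step (threshold is a hypothetical tuning parameter):
p = forward(["normal", "noisy", "noisy", "noisy"])
sensor_reliable = p > 1e-3
```

A real deployment would score each of the device's sensors against its own trained model and feed only the surviving streams into the diagnosis.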
Using High-Content Screening to Generate Single-Cell Gene-Corrected Patient-Derived iPS Clones Reveals Excess Alpha-Synuclein with Familial Parkinson's Disease Point Mutation A30P. Barbuti, Peter; Antony, Paul; Rodrigues Santos, Bruno; et al. In Cells (2020), 9(9).

The generation of isogenic induced pluripotent stem cell (iPSC) lines using CRISPR-Cas9 technology is a technically challenging, time-consuming process with variable efficiency. Here we use fluorescence-activated cell sorting (FACS) to sort biallelic CRISPR-Cas9-edited single-cell iPSC clones into high-throughput 96-well microtiter plates. We used high-content screening (HCS) technology and generated an in-house developed algorithm to select the correctly edited isogenic clones for continued expansion and validation. In our model we have gene-corrected the iPSCs of a Parkinson's disease (PD) patient carrying the autosomal dominantly inherited heterozygous c.88G>C mutation in the SNCA gene, which leads to the pathogenic p.A30P form of the alpha-synuclein protein. Undertaking a PCR restriction-digest-mediated clonal selection strategy prior to sequencing, we were able to post-sort validate each isogenic clone using a quadruple screening strategy prior to generating footprint-free isogenic iPSC lines, retaining a normal molecular karyotype, pluripotency and three-germ-layer differentiation potential. Directed differentiation into midbrain dopaminergic neurons revealed that SNCA expression is reduced in the gene-corrected clones, which was validated by a reduction at the alpha-synuclein protein level.
The generation of single-cell isogenic clones facilitates new insights into the role of alpha-synuclein in PD and, furthermore, is applicable across patient-derived disease models.

Using higher-order adjoints to accelerate the solution of UQ problems with random fields. Hale, Jack; Hauseux, Paul; Bordas, Stéphane. Poster (2018, January 08).

A powerful Monte Carlo variance reduction technique introduced in Cao and Zhang 2004 uses local derivatives to accelerate Monte Carlo estimation. This work aims to: develop a new derivative-driven estimator that works for SPDEs with uncertain data modelled as Gaussian random fields with Matérn covariance functions (infinite/high-dimensional problems) (Lindgren, Rue, and Lindström, 2011); use second-order derivative (Hessian) information for improved variance reduction over our approach in (Hauseux, Hale, and Bordas, 2017); and demonstrate a software framework using FEniCS (Logg and Wells, 2010), dolfin-adjoint (Farrell et al., 2013) and PETSc (Balay et al., 2016) for automatic acceleration of MC estimation for a wide variety of PDEs on HPC architectures.

Using Hypothetical Vacancies in Factorial Surveys to Study Employers' Hiring Decisions – A Valid Approach? Gutfleisch, Tamara Rebecca; Samuel, Robin. Presentation (2019, July 18).
Factorial survey experiments are increasingly employed by scholars interested in understanding the general mechanisms underlying employers' hiring decisions in relation to specific applicant characteristics. Usually, a sample of human resource professionals is asked to rate the hiring chances of hypothetical applicants for a hypothetical job. However, using hypothetical job descriptions for the evaluation of applicants in factorial surveys may reduce the internal and external validity of the results. For example, employers might apply different evaluation standards when assessing the quality of applicant profiles for a hypothetical job (putting less or more weight on certain characteristics) because it is difficult for them to put themselves in the actual hiring situation – affecting the internal validity. In this paper, we contextualize prior factorial survey experiments by examining whether there is a difference in employers' hiring intentions when they are confronted with real versus hypothetical hiring problems. Despite the growing number of factorial surveys and the potential implications for the validity of these data, this question has been widely neglected so far. We employ a factorial survey experiment among recruiters in different occupational sectors in Luxembourg. Recruiters evaluate the hiring chances of several profiles of hypothetical applicants with varying characteristics, referring either to a real vacancy in their company or to a hypothetical (but similar) job type. Preliminary findings suggest no differences in employers' hiring decisions based on the type of evaluation used in the factorial survey. The results partly contradict previous findings from pretest data, which showed significant differences between the average hiring chances in the two groups. By examining the internal validity of presenting hypothetical vacancies, this study contributes to methodological research on factorial surveys as well as to the literature studying employers' hiring decisions.
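A factorial survey of this kind crosses vignette dimensions into a universe of hypothetical applicant profiles, from which each respondent rates a random deck. A minimal sketch of that construction, with invented dimensions and levels rather than the instrument actually used in the study:

```python
# Hedged sketch of vignette-deck generation for a factorial survey.
# The dimensions and levels below are illustrative, not the study's design.
import itertools
import random

dimensions = {
    "education":  ["vocational diploma", "bachelor", "master"],
    "experience": ["none", "2 years", "5 years"],
    "gap_in_cv":  ["no", "yes"],
}

# Full factorial universe: every combination of levels (3 * 3 * 2 = 18).
universe = [dict(zip(dimensions, combo))
            for combo in itertools.product(*dimensions.values())]

# Each respondent rates a randomly drawn deck of vignettes.
random.seed(42)  # fixed seed so decks are reproducible across runs
deck = random.sample(universe, k=6)
print(len(universe))  # → 18
```

The experimental manipulation described in the abstract would then enter as a between-subjects condition: the same decks are framed either against a real vacancy or against a hypothetical job type.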
Using IEEE802.15.4e TSCH in an LLN context: Overview, Problem Statement and Goals. Palattella, Maria Rita; et al. In IETF draft, draft-watteyne-6tisch-tsch-00 (2013).

Using interactive items in arithmetic problems to help Grade 3 students shift from intuitive to arithmetic-based strategies and create mental representation. Martin, Romain; Kreis, Yves; et al. Poster (2016, July 29).

In 2014, 32% of Grade 3 students in fundamental schools in Luxembourg failed to attain the minimum required skill level in mathematics, compared with 30% of students in 2015, as measured by the Ep.Stan examination, a standardized assessment of students at national level. These results have been rather stable since 2011, suggesting that almost 1 in 3 students in Grade 3 do not possess the mathematical skills they need to progress successfully in school. Most students in this bottom tier of performance in mathematics are also found to have low scores in German reading skills (these students also tend to be recent arrivals with a low socio-economic profile) (Martin et al 2014), which, as we will see, has a compounding effect on their mathematics performance. At the beginning of Grade 3 in the fundamental schools in Luxembourg, students begin to delve into the skills needed to solve arithmetic wording problems. It is, however, well known that students encounter more barriers to performing highly on arithmetic wording problems than on problems presented in numeric form (Reusser 1990). Solving an arithmetic wording problem requires not only mathematical skills but also well-developed language reading skills (LeBlanc & Weber-Russel 1996).
Both conditions prevent the low-performing students, who also perform less well on the Ep.Stan, from succeeding. The purpose of this PhD study will be to measure the impact on the Ep.Stan test results of Grade 3 students of letting them learn on wording problems that at first require intuitive strategies, progressing to those needing a more arithmetic strategy, through interactive animated items in the digital learning environment MathemaTIC.

Using iterative design and development for mobile learning systems in school projects. Melzer, André; et al. In Proceedings of ICEC CELDA 2007 (2007).

Using judge biographies together with the CJEU's archives. Fritz, Vera. Presentation (2020, February 20).