2024-03-29T08:12:13Z
https://www.duo.uio.no/oai/request
oai:www.duo.uio.no:10852/34910
2014-12-26T05:04:17Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2012
Logging in has proven to be a major barrier for users of information systems. This is a particular problem for people with reduced sensory, motor, or cognitive function. In the worst case, these people may find themselves shut out of the digital society and essential digital services such as banking, commerce, public services, and social contact.
The Norwegian Anti-Discrimination and Accessibility Act requires ICT services to be universally designed and accessible to as many people as possible, without special adaptation. So far, the public supervisory authorities lack a regulation under the act specifying which requirements it places on these services. There is little research on universal design of authentication solutions, and existing guidelines for universal design contain no direct requirements for authentication solutions.
This thesis examines the framework conditions for developing accessible authentication solutions. Guidelines for such solutions are investigated and proposed, and finally it is assessed how near-field communication can be used to create a universally designed authentication solution.
Some key findings are that one must consider the user's total burden rather than evaluating isolated situations. Today, the cost of authentication solutions falls on the users. Through demands to manage passwords, usernames, and accounts, service providers exploit the user's memory as if it were an unregulated commons. Passwords and usernames are not facing imminent death, so a system must handle and be compatible with existing solutions. Accessible solutions must be stable and must not require the user to constantly learn new metaphors, design conventions, and terminology. The complexity of introducing location as an authentication factor, and of managing many different accounts on one device, can make the system too insecure and impossible for the user to configure securely.
Bertelsen, Ola Njå. Universell utforming. Masteroppgave, University of Oslo, 2012
http://hdl.handle.net/10852/34910
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Bertelsen, Ola Njå&rft.title=Universell utforming&rft.inst=University of Oslo&rft.date=2012&rft.degree=Masteroppgave
URN:NBN:no-33621
171201
Fulltext https://www.duo.uio.no/bitstream/handle/10852/34910/1/Bertelsen-Master.pdf
Universell utforming : Inkluderende og tilgjengelige autentiseringsløsninger
oai:www.duo.uio.no:10852/34912
2014-12-26T05:04:17Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2012
Hash functions are important cryptographic primitives which map arbitrarily long messages to fixed-length message digests in such a way that: (1) it is easy to compute the message digest given a message, while (2) inverting the hashing process (i.e. finding a message that maps to a specific message digest) is hard. An attack against a hash function is an algorithm that nevertheless manages to invert the hashing process. Hash functions are used in authentication, digital signatures, and key exchange, among other applications. A popular hash function used in many practical application scenarios is the Secure Hash Algorithm (SHA-1).
In this thesis we investigate the current state of the art in carrying out preimage attacks against SHA-1 using SAT solvers, and we attempt to find out whether there is any room for improvement in either the encoding or the solving processes.
We run a series of experiments using SAT solvers on encodings of reduced-difficulty versions of SHA-1. Each experiment tests one aspect of the encoding or solving process, such as determining whether there exists an optimal restart interval, or which branching heuristic leads to the best average solving time. An important part of our work is to use statistically sound methods, i.e. hypothesis tests which take sample size and variation into account.
Our most important result is a new encoding of 32-bit modular addition which significantly reduces the time it takes the SAT solver to find a solution compared to previously known encodings. Other results include the fact that reducing the absolute size of the search space by fixing bits of the message up to a certain point actually results in an instance that is harder for the SAT solver to solve. We have also identified some slight improvements to the parameters used by the heuristics of the solver MiniSat; for example, contrary to assertions made in the literature, we find that using longer restart intervals improves the running time of the solver.
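The asymmetry between computing and inverting a digest can be made concrete with a short Python sketch using the standard hashlib module; the brute-force search over a toy message space stands in for the (infeasible) general preimage attack, and the function names are ours, not from the thesis:

```python
import hashlib
import itertools

def sha1_digest(message: bytes) -> str:
    # Forward direction: computing the digest is cheap.
    return hashlib.sha1(message).hexdigest()

def brute_force_preimage(target_digest: str, alphabet: bytes, max_len: int):
    # Backward direction: without structural attacks, inversion is
    # exhaustive search, which is infeasible for real message spaces.
    for length in range(1, max_len + 1):
        for candidate in itertools.product(alphabet, repeat=length):
            msg = bytes(candidate)
            if sha1_digest(msg) == target_digest:
                return msg
    return None
```

A SAT-based preimage attack, as studied in the thesis, replaces this naive search with a CNF encoding of the hash computation handed to a SAT solver.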
Nossum, Vegard. SAT-based preimage attacks on SHA-1. Masteroppgave, University of Oslo, 2012
http://hdl.handle.net/10852/34912
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Nossum, Vegard&rft.title=SAT-based preimage attacks on SHA-1&rft.inst=University of Oslo&rft.date=2012&rft.degree=Masteroppgave
URN:NBN:no-33622
171922
Fulltext https://www.duo.uio.no/bitstream/handle/10852/34912/1/thesis-output.pdf
SAT-based preimage attacks on SHA-1
oai:www.duo.uio.no:10852/8701
2017-12-07T13:07:29Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2009
Conditional inference plays a central role in logical and Bayesian reasoning, and is used in a wide range of applications. It basically consists of expressing conditional relationships between parent and child propositions, and then combining those conditionals with evidence about the parent propositions in order to infer conclusions about the child propositions. While conditional reasoning is a well-established part of classical binary logic and probability calculus, its extension to belief theory has only recently been proposed. Subjective opinions represent a special type of general belief functions. This article focuses on conditional reasoning in subjective logic, where beliefs are represented in the form of binomial or multinomial subjective opinions. Binomial conditional reasoning operators for subjective logic have been defined in previous contributions. We extend this approach to multinomial opinions, thereby making it possible to represent conditional and evidence opinions on frames of arbitrary size. This makes subjective logic a powerful tool for conditional reasoning in situations involving ignorance and partial information, and makes it possible to analyse Bayesian network models with uncertain probabilities.
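The probabilistic core of this kind of conditional deduction, in the special case of dogmatic (uncertainty-free) opinions, reduces to marginalising the child proposition over the parent. A minimal Python sketch (the function name is ours, not from the article; subjective-logic deduction additionally propagates uncertainty mass, which this sketch omits):

```python
def deduce(p_y_given_x: float, p_y_given_not_x: float, p_x: float) -> float:
    # Classical binomial deduction: combine the two conditionals
    # with evidence about the parent x to infer the child y,
    # P(y) = P(y|x)P(x) + P(y|not x)P(not x).
    return p_y_given_x * p_x + p_y_given_not_x * (1.0 - p_x)
```

For example, with P(y|x) = 0.9, P(y|not x) = 0.2 and P(x) = 0.5, the deduced P(y) is 0.55.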
Jøsang, Audun. Conditional Reasoning with Subjective Logic. Journal of Multiple-Valued Logic and Soft Computing
http://hdl.handle.net/10852/8701
358852
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=Journal of Multiple-Valued Logic and Soft Computing&rft.volume=15&rft.spage=5
Journal of Multiple-Valued Logic and Soft Computing
15
1
5
38
URN:NBN:no-21422
88621
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8701/1/88621_josang.pdf
Conditional Reasoning with Subjective Logic
oai:www.duo.uio.no:10852/8700
2017-12-07T13:07:29Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
People who are seeking medical advice and care often find it difficult to obtain reliable information about the quality and competence of health service providers. While transparent quality evaluation of products and services is commonplace in most commercial services, public access to information about the quality of health services is usually very restricted. Online reputation and rating systems represent an emerging trend in decision support for service consumers. Reputation systems are based on collecting information about other parties in order to derive measures of their trustworthiness or reliability on various aspects. More specifically, these systems use the Internet for the collection of ratings and for dissemination of derived reputation scores. Online rating systems applied to the health sector are already emerging. This article describes robust principles for implementing online reputation systems in the health sector. In order to prevent uncontrolled ratings, our method ensures that only genuine consumers of a specific service can rate that service. The advantage of using online reputation systems in the health sector is that they can assist consumers when deciding which health services to use, and that they give health service providers an incentive to deliver high-quality services.
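One common way to derive a reputation score from collected ratings, shown here purely as an illustration and not necessarily as the method of this article, is the expectation of a Beta distribution over the counts of positive and negative ratings:

```python
def beta_reputation(positive: int, negative: int) -> float:
    # Expected value of a Beta(positive + 1, negative + 1) distribution:
    # with no ratings the score is a neutral 0.5, and it converges
    # toward the observed positive ratio as evidence accumulates.
    return (positive + 1) / (positive + negative + 2)
```

A provider with 8 positive and 2 negative verified ratings would score 0.75, while a provider with no ratings starts at the neutral 0.5.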
Jøsang, Audun. Online Reputation Systems for the Health Sector. Electronic Journal of Health Informatics
http://hdl.handle.net/10852/8700
358856
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=Electronic Journal of Health Informatics&rft.volume=3
Electronic Journal of Health Informatics
3
1
URN:NBN:no-21167
88620
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8700/1/88620_josang.pdf
Online Reputation Systems for the Health Sector
oai:www.duo.uio.no:10852/8704
2015-02-13T05:02:18Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
http://hdl.handle.net/10852/8704
238099
-
URN:NBN:no-23898
98497
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8704/1/JLC2008-IFIPTM.pdf
Continuous Ratings in Discrete Bayesian Reputation Systems
oai:www.duo.uio.no:10852/8703
2015-02-13T05:01:31Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
http://hdl.handle.net/10852/8703
238116
-
URN:NBN:no-21755
88623
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8703/1/AAJM2008-AISC.pdf
An Experimental Investigation of the Usability of Transaction Authorization in Online Bank Security Systems
oai:www.duo.uio.no:10852/8702
2015-02-13T05:01:33Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
http://hdl.handle.net/10852/8702
238115
-
URN:NBN:no-21754
88622
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8702/1/JA2008-AISC.pdf
Robust WYSIWYS: A Method for Ensuring that What You See Is What You Sign
oai:www.duo.uio.no:10852/8696
2015-02-13T05:01:33Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
http://hdl.handle.net/10852/8696
238101
-
URN:NBN:no-21751
88616
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8696/1/JBXC2008-TrustBus.pdf
Combining Trust and Reputation Management for Web-Based Services
oai:www.duo.uio.no:10852/8699
2015-02-13T05:00:13Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
http://hdl.handle.net/10852/8699
238106
-
URN:NBN:no-21753
88619
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8699/1/Jos2008b-IPMU.pdf
Abductive Reasoning with Uncertainty
oai:www.duo.uio.no:10852/8698
2015-02-13T05:01:34Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
http://hdl.handle.net/10852/8698
238102
-
URN:NBN:no-21752
88618
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8698/1/Jos2008a-IPMU.pdf
Cumulative and Averaging Unfusion of Beliefs
oai:www.duo.uio.no:10852/8697
2017-12-07T13:06:52Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
http://hdl.handle.net/10852/8697
238104
179
184
10.1109/SECURWARE.2008.64
URN:NBN:no-21221
88617
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8697/1/88617_josang.pdf
Optimal Trust Network Analysis with Subjective Logic
oai:www.duo.uio.no:10852/9976
2014-12-26T05:00:25Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
Recently, mobile services have been introduced to support travellers in planning their public transport journeys. For various reasons, however, uptake of these mobile services among travellers has not been particularly high. The low uptake gives the public transport sector reason to consider how mobile services can be adapted to the population in a way that motivates travellers to continued use.
In that context, I investigate which problems travellers experience with today's information services, so that the traditional problems are not carried over into new mobile solutions. Further, I investigate how public transport information can be presented with the population as the starting point, since the population represents the users of public transport.
A common thread in this work is how mobile devices can enrich travellers with information from their surroundings, and how information can be adapted to travellers in public spaces.
Gartmann, Gjermund. IKT på stasjonen. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/9976
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Gartmann, Gjermund&rft.title=IKT på stasjonen&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-20202
84071
091927684
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9976/1/gartmann.pdf
IKT på stasjonen : En studie av hvordan informasjonskanaler brukes ved trafikk-knutepunkter
oai:www.duo.uio.no:10852/34174
2014-12-26T05:12:00Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2012
TCP tries to send as quickly as possible, yet reacts to congestion in the network by reducing its send rate when a packet is dropped. This mechanism, however, only concerns one of the two directions TCP operates in: from the data sender to the data receiver. The receiver acknowledges data packets, and if these acknowledgements (ACKs) create congestion along the backward path, this is normally neither noticed nor reacted upon.
This thesis presents the first real-life implementation of RFC 5690, an experimental mechanism that addresses this problem by performing "ACK congestion control". Validations of this Linux kernel implementation against previously published simulations show that the mechanism performs correctly; various tests indicate under which conditions it is useful, but also identify a few shortcomings in the original specification.
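The core idea can be suggested with a rough sketch (simplified relative to RFC 5690, and not the kernel implementation itself): apply an AIMD-style rule to the ACK ratio, i.e. the number of data packets covered by one ACK, thinning the ACK stream when the backward path is congested:

```python
def update_ack_ratio(ack_ratio: int, ack_congested: bool) -> int:
    # Sketch of an AIMD rule on the ACK stream: when congestion is
    # observed on the backward (ACK) path, send fewer ACKs by doubling
    # the number of data packets per ACK; otherwise drift slowly back
    # toward acknowledging more often (never below one packet per ACK).
    if ack_congested:
        return ack_ratio * 2
    return max(1, ack_ratio - 1)
```

The actual mechanism also bounds the ratio relative to the congestion window and requires sender cooperation, details this sketch leaves out.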
Olsen, Marius Næss. ACK Congestion Control. Masteroppgave, University of Oslo, 2012
http://hdl.handle.net/10852/34174
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Olsen, Marius Næss&rft.title=ACK Congestion Control&rft.inst=University of Oslo&rft.date=2012&rft.degree=Masteroppgave
URN:NBN:no-32856
168019
Fulltext https://www.duo.uio.no/bitstream/handle/10852/34174/1/thesis-mariusno.pdf
ACK Congestion Control
oai:www.duo.uio.no:10852/9285
2014-12-26T05:11:58Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
Rewriting logic can be used to prototype systems for automated deduction. In this paper, we illustrate how this approach allows experiments with deduction strategies in a flexible and conceptually satisfying way.
This is achieved by exploiting the reflective property of rewriting logic. By specifying a theorem prover in this way one quickly obtains a readable, reliable and reasonably efficient system which can be used both as a platform for tactic experiments and as a basis for an optimized implementation. The approach is illustrated by specifying a calculus for the connection method in rewriting logic which clearly separates rules from tactics.
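The flavour of prototyping deduction by rewriting can be suggested with a toy string-rewriting sketch in Python. This is a drastic simplification: real rewriting logic works on terms, and the thesis exploits reflection, which this sketch does not model.

```python
def rewrite_once(term: str, rules):
    # One rewrite step: apply the first rule (lhs -> rhs) whose
    # left-hand side occurs in the term.
    for lhs, rhs in rules:
        if lhs in term:
            return term.replace(lhs, rhs, 1)
    return term

def normalise(term: str, rules, max_steps: int = 100):
    # Rewrite repeatedly until no rule applies (a normal form),
    # with a step bound since rewriting need not terminate.
    for _ in range(max_steps):
        nxt = rewrite_once(term, rules)
        if nxt == term:
            return term
        term = nxt
    return term
```

The separation the abstract mentions corresponds to keeping the rules (the calculus) distinct from the strategy that chooses which rule to apply next; here the strategy is simply "first applicable rule".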
Holen, Bjarne. A Reflective Theorem Prover for the Connection Calculus. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9285
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Holen, Bjarne&rft.title=A Reflective Theorem Prover for the Connection Calculus&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-10624
28063
05136932x
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9285/1/thesis2.pdf
A Reflective Theorem Prover for the Connection Calculus
oai:www.duo.uio.no:10852/9360
2017-12-07T13:05:04Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
1997
Myhre, Øystein. Brukermedvirkning i konsulentvirksomhet. Hovedoppgave, University of Oslo, 1997
http://hdl.handle.net/10852/9360
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Myhre, Øystein&rft.title=Brukermedvirkning i konsulentvirksomhet&rft.inst=University of Oslo&rft.date=1997&rft.degree=Hovedoppgave
URN:NBN:no-2480
3282
020546106
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9360/1/OMyhre.pdf
Brukermedvirkning i konsulentvirksomhet : hva er brukernes rolle?
oai:www.duo.uio.no:10852/9054
2013-03-12T07:57:20Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2003
Central to this thesis, written for Statskonsult, is LivsIT (Livssituasjonsbasert IT-system, a life-situation-based IT system). LivsIT is, simply put, a citizen's perspective on public information and services, with the goal of coordinated e-government. Its starting point is to make it possible to collect and present public information and services on the user's terms, regardless of how the public sector is organized. The administration should appear unified to its users, that is, coordinated across sectors and levels of government through a common user interface. As a system, LivsIT offers an alternative entry point to public information, which has often been presented agency by agency or sector by sector, structured around the administration's own organization.
In this thesis I identify a set of design criteria for the LivsIT system, based on Jakob Nielsen's ten usability heuristics and HCI theory. A central part of the principles is based on results from user testing of existing prototypes. I aim to identify and collect interface requirements from the state, municipalities, and the public, and on that basis design a proposed web interface. The proposal consists of a prototype that mainly focuses on navigation on the site. My research question is therefore: which design principles is it important to follow when designing a LivsIT system for presentation on the web, so that the political goals of a user-focused, round-the-clock administration give the system's users good usability?
With this theory as a framework, I take the existing LivsIT systems as my starting point and evaluate them through user testing, using the thinking-aloud method. My method belongs to the user-oriented tradition, and since the evaluation takes place in parallel with the development of the prototypes, the evaluation can in principle influence the further development of the project. In addition to usability testing, I carry out a heuristic (predictive) evaluation of the existing LivsIT prototypes.
Bjerke, Dag Thorer. Design av grensesnitt for døgnåpen forvaltning. Hovedoppgave, University of Oslo, 2003
http://hdl.handle.net/10852/9054
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Bjerke, Dag Thorer&rft.title=Design av grensesnitt for døgnåpen forvaltning&rft.inst=University of Oslo&rft.date=2003&rft.degree=Hovedoppgave
URN:NBN:no-7445
15355
031899161
Design av grensesnitt for døgnåpen forvaltning
oai:www.duo.uio.no:10852/9286
2014-12-26T05:11:59Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
This thesis provides contributions to the research programming language Creol (Concurrent Reflective Object-oriented Language). The first contribution is the EBNF grammar for Creol. The second contribution suggests how to extend the Creol language with functional constructs. The third and major contribution is the design of a type system for the Creol language, as well as some molding of the Creol language, such that static type safety is achieved. The fourth contribution is a prototype implementation of a compiler for Creol.
The Creol language has until now provided static type safety and separation between inheritance and subtyping by assumption only. The creation of the Creol type system investigates this assumption for the Creol language. During the process there has also been a clarification of the Creol language from a type system point of view. The type system designed for Creol is a hybrid between a structural and a nominal type system, and is a step towards a novel hybrid type system that facilitates a separation between inheritance and subtyping, while enforcing nominal constraints when desirable.
The prototype compiler implemented for Creol is crafted with tools that operate on a higher level than traditional compiler tools. These high-level approaches include combinator parsing and attribute grammars.
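Combinator parsing, one of the high-level tools mentioned, composes small parsing functions into larger ones. A minimal Python sketch of two combinators (illustrative only; the actual compiler is not written in Python, and these names are ours):

```python
def char(c):
    # Smallest combinator: parse a single expected character.
    # A parser maps an input string to (value, remaining_input) or None.
    def parse(s):
        if s and s[0] == c:
            return (c, s[1:])
        return None
    return parse

def seq(p, q):
    # Sequencing combinator: run p, then q on what p left over.
    def parse(s):
        r1 = p(s)
        if r1 is None:
            return None
        v1, rest = r1
        r2 = q(rest)
        if r2 is None:
            return None
        v2, rest2 = r2
        return ((v1, v2), rest2)
    return parse
```

Larger grammars are built by nesting such combinators, which is what makes the approach attractive for quickly crafting a compiler front end.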
Fjeld, Jørgen Hermanrud. Compiling Creol Safely. Hovedoppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9286
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Fjeld, Jørgen Hermanrud&rft.title=Compiling Creol Safely&rft.inst=University of Oslo&rft.date=2005&rft.degree=Hovedoppgave
URN:NBN:no-10625
28067
051369370
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9286/1/Thesis_on_CreolCompiler.pdf
Compiling Creol Safely
oai:www.duo.uio.no:10852/9318
2017-12-07T13:05:04Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
In this thesis we propose a model-based approach to support data integration between heterogeneous enterprise systems. We review the literature on interoperability, present several aspects of data integration problems, and aim to give the reader an understanding of model-driven development, which offers various standards for modeling and model transformation. The thesis presents difficulties encountered in data integration by analysing problem examples; based on this analysis, data integration problems are defined. We examine technologies related to interoperability, data integration and mapping, and present existing approaches to the problem examples. The main goal is to specify how to develop tools for solving data integration problems by describing and realizing mappings between models. The technique specified to realize the mapping is presented in our proposed solution, which we call the MODI Framework.
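The notion of describing and then realizing a mapping between models can be illustrated, in a drastically simplified form, as a declarative field-name mapping applied to records. This is an illustration only, not the MODI Framework API:

```python
def apply_mapping(record: dict, mapping: dict) -> dict:
    # "Describe" the integration as data (source field -> target field),
    # then "realize" it by applying that description to each record,
    # keeping the mapping separate from the transformation engine.
    return {target: record[source] for source, target in mapping.items()}
```

Model-driven approaches lift the same idea from field names to full metamodels, so the mapping itself becomes a transformable model.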
Khan, Mohammad Asaf; Mahmood, Khudija. MODI framework - A model-based approach to data integration. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9318
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Khan, Mohammad Asaf&rft.au=Mahmood, Khudija&rft.title=MODI framework - A model-based approach to data integration&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-10966
29022
051545861
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9318/2/thesisFinal.pdf
MODI framework - A model-based approach to data integration
oai:www.duo.uio.no:10852/9870
2017-12-07T13:05:05Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2007
A 'new' approach to community building is based on the concept of salutogenesis (a proactive approach to health promotion and prevention). Increasing organisational focus on sustaining healthy work forces requires a coherent mechanism for coping, social cohesion and community development. This research is based on a ten-month ethnographic study of social workers, health professionals and technologists of a Norwegian NGO involved in community health promotion. The aim was to develop a well-formed understanding of the three salutogenic criteria in terms of community-building processes. It was found that collaborating, planning (organising), and defining the community were the key areas of salutogenic community building. Based on a processual world view of context and action (change), the adaptation of a coherent conceptual framework for modelling practices allowed the identification of generic salutogenic practices in community building at a fundamental level: a non-compositional, non-substance semantico-ontological framework (semantic holism and process ontology).
Viravong, Khamphira. Salutogenic community building. International Journal for Web-based Communities. 2007, 3, 32
http://hdl.handle.net/10852/9870
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=International Journal for Web-based Communities&rft.volume=3&rft.spage=32&rft.date=2007
URN:NBN:no-18856
71893
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9870/2/wbc06_review_1.pdf
Salutogenic community building
oai:www.duo.uio.no:10852/9619
2014-12-26T05:12:00Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
The aim of this thesis is the specification and development of a new UML virtual machine, UMLexe, capable of executing platform-independent system specifications.
Executing models requires computational completeness; UMLexe proposes a subset of UML together with an operational semantics for executing models in that subset. UMLexe provides prototype functionality to prove the concept of executing components combined with interaction models.
The first part of the thesis describes a case scenario illustrating the model notation. After a more detailed look at the specification and implementation, this case is executed to prove the concept. The last part of the thesis is dedicated to the specification and development of the UMLexe virtual machine and the evaluation of the implementation against defined requirements and existing solutions for executing UML models.
Fredriksen, Kai. UMLexe – UML virtual machine. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9619
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Fredriksen, Kai&rft.title=UMLexe – UML virtual machine&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-14309
52429
070175594
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9619/1/UMLexe-masterthesis.pdf
UMLexe – UML virtual machine : a framework for model execution
oai:www.duo.uio.no:10852/9273
2017-12-07T13:05:05Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
In this thesis we revisit a handful of well-known experiments, using modern tools, to see whether results from earlier experiments remain valid for today's heterogeneous networks. The traffic properties we look at are relevant for designing and optimizing network equipment, such as routers and switches, and for building corporate networks. We have looked at the characteristics of two different heterogeneous networks: a university network and an ISP network. We captured traffic from different weeks, and at different times of the day.
We first describe the challenges involved in collecting, processing and analyzing traffic traces from high-speed networks. We then look at the various factors that contribute to uncertainty in such measurements, and try to compensate for these factors. The experiments involve collection and analysis of high-resolution traffic traces from two operative networks, each containing several gigabytes of network traffic data. We look at properties such as packet inter-arrival time distributions, packet size distributions, modeling of packet arrivals (self-similarity versus Poisson), traffic per application (egress traffic per destination port), and protocol distributions. A simplistic attempt to quantify the volume of Peer-to-Peer (P2P) traffic, inspecting both header data and payload, is made to evaluate the efficiency of today's identification methodology (port numbers only). We have used freely available tools like TCPDump, Ethereal, TEthereal, Ntop, and especially the CAIDA CoralReef suite. The shortcomings of these tools for particular tasks have been compensated for by writing custom Perl scripts, showing that advanced analysis is possible with fairly simple means.
Our results reveal measurable differences between the two networks in packet inter-arrival time distributions and statistical properties. We also find significant differences in the application distribution, and in the deployment of new technologies such as Multicast.
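The inter-arrival analysis mentioned above boils down to differencing consecutive capture timestamps. A minimal Python sketch (the names are ours; in practice the timestamps would come from a capture tool such as TCPDump rather than a literal list):

```python
def inter_arrival_times(timestamps):
    # Packet inter-arrival times: differences between consecutive
    # capture timestamps (in seconds); sort defensively in case the
    # trace is not strictly ordered.
    ts = sorted(timestamps)
    return [b - a for a, b in zip(ts, ts[1:])]

def mean(xs):
    # Simple summary statistic over the inter-arrival distribution.
    return sum(xs) / len(xs)
```

From such per-packet gaps one can then fit and compare candidate arrival models (e.g. Poisson versus self-similar), as the thesis does.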
Thorkildssen, Håvard Wik. Passive Traffic Characterization and Analysis in Heterogeneous IP Networks. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9273
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Thorkildssen, Håvard Wik&rft.title=Passive Traffic Characterization and Analysis in Heterogeneous IP Networks&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-10614
27581
051368021
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9273/1/Passivemeasurements-1.0.pdf
Passive Traffic Characterization and Analysis in Heterogeneous IP Networks
oai:www.duo.uio.no:10852/9297
2013-03-12T07:57:26Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
This thesis addresses trust in electronic voting, and whether it is possible to build electronic voting systems that voters can trust. For a voter to trust an electronic voting system, he must feel confident that his vote is recorded and counted correctly, and that he remains anonymous throughout the voting process. One precondition is that the system is at least as secure, and preserves anonymity at least as well, as today's manual system.
The thesis considers various ways these problems can be solved, both in terms of the voter's subjective perception of security and in terms of actual security. The conclusion is that this can be solved satisfactorily.
Fridtun, Dag. Tillit til elektronisk valg. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9297
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Fridtun, Dag&rft.title=Tillit til elektronisk valg&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-10638
28264
051156342
Tillit til elektronisk valg
oai:www.duo.uio.no:10852/9272
2014-12-26T05:12:02Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
For many reasons, large and small installations of computers can benefit from Automated Configuration Management tools. All the processes from installation and configuration to maintenance and updating can benefit from automation, for the following reasons:
* Consistency across all the machines.
* Timeliness in maintenance and updates.
* Simplification of the process through the use of declarative instructions.
Software configuration management (SCM), meanwhile, examines the problems of identifying, controlling, monitoring and verifying changes in software development projects. Some of the motivations for software configuration management are:
* Consistency in the source code.
* Timely updates to project members, so that they have what they need.
* A need to simplify documentation and development of complex projects.
The purpose of this thesis is therefore to understand how concepts from Software Configuration Management can aid the development of the field of System Configuration. To achieve this, the thesis starts with an examination of the similarities between SCM and System Configuration, followed by an examination of key concepts in System Configuration and of three tools that have taken different approaches to the problem:
* Cfengine
* ISconf
* LSconf
With an understanding of how System Configuration and SCM are similar, and of many of the major concepts in System Configuration, the next step is to examine some of the differences between the two fields. From there, it should be possible to see how some concepts from SCM could be applied to System Configuration, and likewise to examine concepts from System Configuration that could be applied to SCM.
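The declarative style shared by tools like Cfengine can be suggested with a minimal sketch: compare a desired state against the actual state and emit only the actions needed to converge, so that a second run after convergence is a no-op. The names are ours, not any tool's API:

```python
def convergence_actions(desired: dict, actual: dict) -> list:
    # Declarative configuration in miniature: the operator states WHAT
    # the system should look like; the tool computes HOW to get there.
    # Re-running after convergence yields no actions (idempotence).
    actions = []
    for key, value in desired.items():
        if actual.get(key) != value:
            actions.append((key, value))
    return actions
```

This convergence-to-desired-state loop, run repeatedly on every host, is the property that gives such tools both consistency and timeliness.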
Tam, Weng Seng. A potpourri of system configuration concepts. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9272
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Tam, Weng Seng&rft.title=A potpourri of system configuration concepts&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-10612
27580
051163977
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9272/1/wengxseng-thesis.pdf
A potpourri of system configuration concepts
oai:www.duo.uio.no:10852/9524
2017-12-07T13:05:05Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2006
ABSTRACT
This master thesis focuses on the use of NFC payment in a Ubiquitous Computing context. NFC payment, and the possibilities that emerge from this technology, are described. A case study has been conducted on the use of NFC payment with two user groups with different backgrounds. Knowledge from both of the previously mentioned pieces of work has been used to discuss how NFC payment appears as visible or invisible to the users.
The problem statements are:
- Describe the possibilities that emerge with NFC as a payment method.
- Conduct a user study of NFC as a payment method.
- Discuss how NFC payment appears as visible or invisible technology for the users in the user study.
Twelve users participated in this study. The focus has been on an NFC phone that can be used for payments. The NFC phone was used in a user study specially designed to observe the users' experiences and reactions when the NFC phone shifted between visible and invisible contexts. The results from the study are presented in this thesis.
The theoretical framework has been Ubiquitous Computing and related theories. Main concepts have been invisibility vs. visibility, center and periphery of attention, and routine invisibility, to mention some.
The study showed that the users approached and related differently to NFC payment technology. This was evident through the different reactions and experiences the users expressed through surveys and interviews. It was not possible to sum up with a concluding remark on how NFC payment appears as visible or invisible technology for the users in the user study.
Khan, Ummear. Contactless Payment with Near Field Communication. Masteroppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9524
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Khan, Ummear&rft.title=Contactless Payment with Near Field Communication&rft.inst=University of Oslo&rft.date=2006&rft.degree=Masteroppgave
URN:NBN:no-12779
43486
061302848
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9524/2/Khan.pdf
Contactless Payment with Near Field Communication : An Empirical Study in Ubiquitous Computing Context
oai:www.duo.uio.no:10852/9298
2014-12-26T05:03:27Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
The Election Markup Language (EML) is a communication language used
between and within different subsystems of a computerized electoral
information system. EML is defined by means of a set of 33 XML
Schemas.
This thesis tests the hypothesis that the EML communication language
is suitable for a computerized Norwegian Electoral System. The testing
is performed using a prototype implementation in Java. Though the
implementation does not take into consideration security and anonymity
concerns, it is a full implementation of the Electoral System. The
prototype system consists of five subsystems that communicate using a
network connection. The implementation spans 10115 lines of code and
58 classes.
EML is found to be very close to a communication language suitable for
the Norwegian Electoral System, though a few changes would have to be
made to the standard to express the information exchange required by
law. The shortcomings in EML are countered with proposed changes to the standard, and in addition some parts of the Norwegian Election Law and Election Regulations are proposed for change.
Aas, Patricia S. M. Rincon G.. Evaluating the suitability of EML 4.0 for the Norwegian Electoral System. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9298
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Aas, Patricia S. M. Rincon G.&rft.title=Evaluating the suitability of EML 4.0 for the Norwegian Electoral System&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-10639
28266
051369818
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9298/1/Documentation.pdf
Evaluating the suitability of EML 4.0 for the Norwegian Electoral System : A prototype approach
oai:www.duo.uio.no:10852/9299
2014-12-26T05:00:32Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
This thesis describes the design and implementation of JavaSplitter, a
prototype incremental proof search engine based on a variable splitting sequent calculus. The prover also includes modes for variable pure derivations, and for variable sharing derivations without splitting.
The splitting calculus uses an index system to achieve variable
sharing derivations, and to keep track of how variables are split into
different branches of a derivation. A graph representation of the
indices occurring in a skeleton and operations on this graph are used
to determine when splitting of such variables is sound.
The design and implementation of the data structures and operations
necessary for the proof search procedures are described. Further, the
three modes of proof search are compared with regard to number of
steps used to reach a proof for a set of valid input sequents.
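A much-simplified analogue of backward sequent-calculus proof search — propositional only, with none of the free-variable indexing or splitting machinery the thesis develops — can be sketched as:

```python
# Backward proof search for classical propositional sequents left ⊢ right.
# Formulas are atoms (strings) or tuples ('not', A), ('and', A, B),
# ('or', A, B), ('imp', A, B). All rules here are invertible, so we may
# commit to decomposing the first compound formula we find.

def provable(left, right):
    # Axiom: an atom occurs on both sides of the sequent.
    for f in left:
        if isinstance(f, str) and f in right:
            return True
    for i, f in enumerate(left):           # left rules
        if not isinstance(f, str):
            rest, op = left[:i] + left[i + 1:], f[0]
            if op == 'not':
                return provable(rest, right + [f[1]])
            if op == 'and':
                return provable(rest + [f[1], f[2]], right)
            if op == 'or':
                return (provable(rest + [f[1]], right) and
                        provable(rest + [f[2]], right))
            if op == 'imp':
                return (provable(rest, right + [f[1]]) and
                        provable(rest + [f[2]], right))
    for i, f in enumerate(right):          # right rules
        if not isinstance(f, str):
            rest, op = right[:i] + right[i + 1:], f[0]
            if op == 'not':
                return provable(left + [f[1]], rest)
            if op == 'and':
                return (provable(left, rest + [f[1]]) and
                        provable(left, rest + [f[2]]))
            if op == 'or':
                return provable(left, rest + [f[1], f[2]])
            if op == 'imp':
                return provable(left + [f[1]], rest + [f[2]])
    return False                           # only atoms left, no axiom

valid = provable([], [('or', 'p', ('not', 'p'))])  # excluded middle
```

The first-order calculus the thesis implements replaces this exhaustive decomposition with skeletons, free variables and the index system that makes variable splitting sound.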
Ekern, Karianne. JavaSplitter. A Java Implementation of Variable Splitting Proof Search. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9299
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Ekern, Karianne&rft.title=JavaSplitter. A Java Implementation of Variable Splitting Proof Search&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-10641
28267
051369842
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9299/1/ThesisBib.pdf
JavaSplitter. A Java Implementation of Variable Splitting Proof Search
oai:www.duo.uio.no:10852/9731
2017-12-07T13:05:05Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2007
This study addresses the current problems of utilisation of health management information systems (HMIS) in developing countries due to the critical shortage of qualified and motivated human resources. The aims and objectives of this study are to (1) understand local HMIS-related practices in order to see whether health worker motivation affects or is affected by these practices, (2) understand how motivational theory applies and can be adjusted to cater for computing work of a supportive nature, and finally (3) offer suggestions for practice that might improve motivation towards HMIS responsibilities.
The study employed qualitative research methods in an interpretive in-depth case study, including literature studies, semi-structured open ended interviews, observations of daily routines and meetings, and document analysis of policy guidelines and national and lower level reports. The study was carried out in Chikwawa district (August to September 2006) and Chiradzulu district (September to October 2006) in the South West region of Malawi.
Important positive findings in this study included the observed positive value of: (1) HMIS training; (2) HMIS review meetings; (3) supervision targeting HMIS routines; (4) an incentive scheme rewarding health facilities for the quality of HMIS reports; and (5) health workers who were interested and willing to learn. Important findings of concern included: (1) problems of lack of skills and understanding, mainly among lower-level staff, of the importance of data; (2) the national priority of HMIS was not reflected in all practices at superior levels, resulting in a lack of understanding of this priority at lower levels; (3) considerable amounts of adaptation work were crucial to the functioning of the system due to a general shortfall in computing resources; (4) job context factors not directly related to the functioning of the HMIS hold considerably high potential for demotivation in general.
Analyses of the findings in this study are based on motivational theory, using the terms motivation and demotivation as described by Herzberg et al. (1993) and the six categories of good and bad critical motivational incidents defined by Machungwa and Schmitt (1983). Gasser's (1986) definitions of primary, articulation and adaptation work in relation to computing work are applied to address the supportive nature of HMIS work towards other work (management and patient care).
This research suggests that the motivational items identified by Machungwa and Schmitt (1983) are chiefly relevant to the Malawian context, but that they should be adjusted to the specific case of the health sector, as well as to the different values of workers at different levels within the sector. The different categories of personnel holding HMIS responsibilities of different natures are suggested to require different motivators. It is also suggested that the division of HMIS-related work between HMIS staff and health workers be considered carefully, to improve the motivational potential.
Hamre, Gro Alice. Motivation and demotivation among health staff at facility and districts level. Masteroppgave, University of Oslo, 2007
http://hdl.handle.net/10852/9731
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Hamre, Gro Alice&rft.title=Motivation and demotivation among health staff at facility and districts level&rft.inst=University of Oslo&rft.date=2007&rft.degree=Masteroppgave
URN:NBN:no-15071
62569
07102915x
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9731/4/Hamre.pdf
Motivation and demotivation among health staff at facility and districts level : A case study of the national Health Management Information system of Malawi
oai:www.duo.uio.no:10852/9300
2014-12-26T05:00:31Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
Integration of data types and functions in Creol.
Implementation of general data types and user-defined functions that previously had to be hand-coded in the interpreter. This also covers subtypes and user-defined data types. In addition, the evaluation of data types and functions is affected. The thesis aims to examine solutions to this.
Safadi, Nabil Mounzir. Datatyper og eksempelstudier i Creol. Hovedoppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9300
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Safadi, Nabil Mounzir&rft.title=Datatyper og eksempelstudier i Creol&rft.inst=University of Oslo&rft.date=2005&rft.degree=Hovedoppgave
URN:NBN:no-10642
28268
051369877
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9300/1/hovedfag.pdf
Datatyper og eksempelstudier i Creol
oai:www.duo.uio.no:10852/9919
2017-12-07T13:05:06Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
How can groups make the best possible decisions? Some recommended methods
are group discussion, the Delphi method and the statistical group. None of them does what good methods really should: stimulate critical thinking, give more influence to those who are smarter or who hold important information, or offer incentives. A 'prediction market' is a way of obtaining predictions by getting people to place bets. It has the properties the other methods lack, which also leads to better predictions. Some experts have claimed that prediction markets are vulnerable to manipulation, for example by those who might wish to sabotage the predictions. Research suggests that this is not the case: in practice, it is difficult to manipulate markets.
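The abstract does not fix a particular market mechanism, but a standard design from the prediction-market literature is Hanson's logarithmic market scoring rule (LMSR), sketched here to show how bets move the market's implied probability; the liquidity parameter and trade sizes are illustrative:

```python
# Sketch of Hanson's logarithmic market scoring rule (LMSR) for a
# prediction market. q holds the outstanding shares per outcome; b is
# the (illustrative) liquidity parameter.
import math

def lmsr_cost(q, b=100.0):
    """Cost function C(q) = b * ln(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(qi / b) for qi in q))

def lmsr_prices(q, b=100.0):
    """Instantaneous prices, interpretable as event probabilities."""
    z = sum(math.exp(qi / b) for qi in q)
    return [math.exp(qi / b) / z for qi in q]

# Two-outcome market: buying shares in outcome 0 pushes its price up.
q = [0.0, 0.0]
p0_before = lmsr_prices(q)[0]                      # 0.5 at the start
trade_cost = lmsr_cost([50.0, 0.0]) - lmsr_cost(q)  # price of the bet
p0_after = lmsr_prices([50.0, 0.0])[0]             # above 0.5 after it
```

The bettor pays the cost difference, and the new prices encode the market's updated probability estimate, which is the aggregation property the thesis discusses.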
Rogne, Øyvind. Put your money where your mouth is. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/9919
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Rogne, Øyvind&rft.title=Put your money where your mouth is&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-19181
78351
080980252
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9919/1/Rogne.pdf
Put your money where your mouth is : Prediksjonsmarkeders styrker og svakheter i lys av forskningen på gruppebeslutningstaking
oai:www.duo.uio.no:10852/9626
2014-12-26T05:14:49Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
1999
This research has been initiated on the basis of practical experiences in developing a relatively large SGML system at the University of Oslo. This thesis contributes to the field of information systems, with a particular focus on document systems. The aim of this work is to inform the design of document systems by considering the transformation from paper to digital documents in organizations. The Standard Generalized Markup Language (SGML, ISO 8879) approach is emphasized. The SGML approach takes the documents' structure and content as the starting point in design, and regards the document as a collection of structured information. This approach is challenged and tentatively improved by empirical studies of documents in use and theoretical considerations of artifacts at work.
The research approach has been an Action Case, as defined by Vidgen and Braa (1997). The interpretation of the transformation process from paper to digital documents is based mainly on an in-depth case study that was conducted at a Norwegian news agency from January 1996 to March 1998. The empirical findings are discussed according to theoretical concepts that emphasize the significance of artifacts at work to illuminate the various roles of documents at work.
Concepts from Actor Network Theory (ANT) (see, for example, Callon, 1986; Latour, 1987; Law, 1986) are applied to emphasize the interrelations of humans and artifacts, as well as the importance of artifacts' properties in these relations. The concepts of 'boundary object' (Star and Griesemer, 1989) and 'borderline issues' (Brown and Duguid, 1994) are applied to get various perspectives on the actor-network.
The study illustrates that it is challenging to substitute paper documents with SGML documents. Firstly, two different types of technology, with different properties and features, are exchanged. By removing paper documents, we also remove resources that go beyond the canonical meaning of the artifact. These resources are related to paper as a technology. Secondly, the document perspective in SGML is too restricted in relation to the various perspectives on documents in practical use. The emphasis on structure complicates the production of documents. Thirdly, the application of shared document models across work practices turns the various heterogeneous actor-networks into one network, which requires a common objective among the actors involved. The dilemma of "who does the job and who gets the benefits" (Grudin, 1989; 1994) arises as well.
The study indicates that an investigation of the actor-networks that include documents provides an insight into the more hidden aspects of work. By regarding documents' central, peripheral, local and shared properties, one can gain an understanding of how documents are embedded in work, including the importance of documents and related artifacts to aspects such as awareness, articulation and coordination of work. The properties determine how things become interrelated into heterogeneous networks. The research shows how a document's properties or inscriptions are essential to its production and application in use. Insight into these prerequisites helps us to understand how the computer system can fit into work practices, even if we have no guarantees that it will be used in the way that we expect. With regard to design, work practices are improved by changing the technical properties or the technical fundamentals, by adding various inscriptions into the system. This thesis describes how an existing system was improved by the use of 'gateways'. In the design of the gateways, the idea has been to keep the technical possibilities that SGML provides, and at the same time take into account our knowledge about the paperwork.
Sandahl, Tone Irene. From paper to digital documents. Doktoravhandling, University of Oslo, 1999
http://hdl.handle.net/10852/9626
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Sandahl, Tone Irene&rft.title=From paper to digital documents&rft.inst=University of Oslo&rft.date=1999&rft.degree=Doktoravhandling
URN:NBN:no-14759
52808
070567883
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9626/1/Dr-Tone.pdf
From paper to digital documents : Challenging and improving the SGML approach
oai:www.duo.uio.no:10852/10026
2017-12-07T13:05:06Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
Current trends in communications technology favour modular, reusable service
components delivered over converged networks. This service oriented approach gives
rise to a new breed of composite, personalised services, but so far web-based providers
dominate this market. In order to adapt and compete, telecom operators need to implement
effective methods for developing, integrating and delivering services.
IP Multimedia Subsystem (IMS) is a fairly new framework being tested and rolled out
by telecom operators around the globe. It has a layered architecture that accommodates
the delivery of personalised, composite, multimedia services over converged
heterogeneous networks. However, the static nature of IMS service chaining limits its
flexibility.
This thesis proposes a model for dynamic service orchestration in IMS centred architecture.
The proposal addresses orchestration of typical IMS services running on
native SIP application servers as well as the incorporation of a variety of services residing
in foreign domains. In particular, the possibility to include external Web Services
in composite services in the IMS domain is examined. A new hierarchical configuration
of service brokers is introduced and a basic prototype is implemented in the scenario
of a composite Presence service.
The proposed model in this thesis augments the current IMS service provisioning
mechanism in several ways: It introduces the notion of dynamic service brokering.
It adds explicit support for non-telecom services in native IMS application servers.
Furthermore, the proposed model utilises only reference points already present in the
existing IMS specification, no additional protocols or control functions were needed.
The functionality introduced is meant to improve the flexibility of IMS service provisioning
in terms of both the type of services that can be offered natively as well as the
types of services that can be supported from third parties.
The intention of this thesis is to provide a model for IMS service orchestration.
In particular, it identifies the technologies that make such an architecture feasible as
well as points out best practices for maintaining performance levels. The model was
verified by implementing a prototype that blends native IMS services and external Web
Services in the IMS domain.
Pearce, Arlene Marie. Service Orchestration in IMS Centred Architecture. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/10026
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Pearce, Arlene Marie&rft.title=Service Orchestration in IMS Centred Architecture&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-24511
89333
101789068
Fulltext https://www.duo.uio.no/bitstream/handle/10852/10026/1/Pearce.pdf
Service Orchestration in IMS Centred Architecture
oai:www.duo.uio.no:10852/8963
2014-12-26T05:10:28Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2011
Today, computer and network security are a big part of a system administrator's life. New methods and applications appear, and several make the system administrator's job easier.
Gathering information is a big part of detecting threats and staying one step ahead of the black hats.
This thesis investigates a specific area in network security,
namely passive operating system detection.
Information is important in network security, and knowing your enemies is important in securing your network.
Passive operating system detection helps collect information passively, which can be used to the administrator's advantage.
The thesis looks at passive operating system detection applications, especially p0f and prads.
By running both applications in a larger network and testing them in a
controlled environment, the weaknesses of both applications are revealed, and improvements are suggested and, where possible, implemented.
The improvements discussed in this thesis are the addition of new signatures and the creation of Perl scripts that improve the applications themselves.
The scripts deal with the output from the applications, which tends to be overwhelming and needs new presentation methods.
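The core of p0f-style passive detection is matching fields observed in a TCP SYN packet (such as TTL and window size) against a signature database, without sending any probes. A toy sketch with made-up signatures:

```python
# Toy sketch of passive OS fingerprinting in the style of p0f/prads:
# fields observed in a SYN are matched against known signatures.
# The (initial TTL, window size) signatures below are illustrative only.
SIGNATURES = {
    (64, 5840): "Linux 2.6 (illustrative)",
    (128, 65535): "Windows XP (illustrative)",
}

def guess_os(ttl, window):
    """Round the observed TTL up to a common initial value (each router
    hop decrements it), then look up the (TTL, window) pair."""
    for initial in (64, 128, 255):      # typical initial TTL values
        if ttl <= initial:
            ttl = initial
            break
    return SIGNATURES.get((ttl, window), "unknown")

guess = guess_os(ttl=57, window=5840)   # TTL decremented by 7 hops
```

Real signatures also include TCP option order, MSS and other quirks; adding such entries is essentially the "new signatures" improvement the thesis describes.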
Falch, Petter Bjerke. Investigating Passive Operating System Detection. Masteroppgave, University of Oslo, 2011
http://hdl.handle.net/10852/8963
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Falch, Petter Bjerke&rft.title=Investigating Passive Operating System Detection&rft.inst=University of Oslo&rft.date=2011&rft.degree=Masteroppgave
URN:NBN:no-29212
132795
114802688
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8963/1/Masterxthesisxspringx2011.pdf
Investigating Passive Operating System Detection
oai:www.duo.uio.no:10852/9150
2014-12-26T05:11:21Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2004
A MANET is a multi-hop ad-hoc wireless network
where nodes can move arbitrarily in the topology. The network has no
given infrastructure and can be set up quickly in any environment.
The Optimized Link State Routing (OLSR) protocol is a route
management protocol for such mobile ad hoc networks.
This study presents the work of implementing the
OLSR routing protocol. The implementation is done
in a modular fashion, allowing for the use of external plugins.
This study also analyzes certain extensions to the protocol made in
relation to the implementation, including Internet connectivity,
security and auto-configuration. More technical implementation
designs are also covered.
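A central mechanism in OLSR is multipoint relay (MPR) selection: each node picks a small set of one-hop neighbours that together cover all two-hop neighbours, so that only MPRs need to forward flooded traffic. A greedy sketch over an invented topology (RFC 3626 adds willingness and degree tie-breaking, omitted here):

```python
# Greedy sketch of OLSR multipoint relay (MPR) selection.
# `two_hop_cover` maps each one-hop neighbour to the set of two-hop
# neighbours reachable through it. The topology is invented.

def select_mprs(one_hop, two_hop_cover):
    """Repeatedly pick the neighbour covering the most still-uncovered
    two-hop neighbours, until every two-hop neighbour is covered."""
    uncovered = set().union(*two_hop_cover.values())
    mprs = set()
    while uncovered:
        best = max(one_hop, key=lambda n: len(two_hop_cover[n] & uncovered))
        mprs.add(best)
        uncovered -= two_hop_cover[best]
    return mprs

cover = {"A": {"x", "y"}, "B": {"y", "z"}, "C": {"z"}}
mprs = select_mprs(["A", "B", "C"], cover)  # {"A", "B"} suffices; C is redundant
```

Flooding only via the selected MPRs is what lets OLSR reduce control-message overhead compared with classic link-state flooding.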
Tønnesen, Andreas. Impementing and extending the Optimized Link State Routing Protocol. Hovedoppgave, University of Oslo, 2004
http://hdl.handle.net/10852/9150
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Tønnesen, Andreas&rft.title=Impementing and extending the Optimized Link State Routing Protocol&rft.inst=University of Oslo&rft.date=2004&rft.degree=Hovedoppgave
URN:NBN:no-9885
20235
042004152
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9150/1/report.pdf
Impementing and extending the Optimized Link State Routing Protocol
oai:www.duo.uio.no:10852/9749
2014-12-26T05:11:21Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2007
This project is about the description of ontologies for anomaly detection in computer systems. The anomaly detection system in Cfengine is used as a case study. Cfengine was designed at Oslo University College, based on a considerable body of research, and thus we have detailed insight into its operation. The Cfengine environment daemon, in collaboration with cfagent, collects many events that are presented to a system administrator for
further analysis and countermeasures. In this work we want to make use of ontologies to structure this knowledge in a way that makes the process of reasoning about anomalies clearer. Ultimately, one could imagine that ontology capabilities would enable computers to perform an automatic filtering process through inference and reasoning about their problem space.
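Before ontology-based reasoning can be applied, events must first be classified as anomalous. A minimal statistical version of such classification (not Cfengine's actual model, whose learning is more elaborate) flags values that deviate far from their historical mean:

```python
# Minimal statistical anomaly flagging: a metric is anomalous when its
# current value lies more than k standard deviations from its history.
# Metric names, histories and the threshold k are illustrative only.
import statistics

def anomalies(history, current, k=2.0):
    """Return the metrics whose current value deviates more than k
    population standard deviations from the historical mean."""
    flagged = []
    for metric, values in history.items():
        mean = statistics.mean(values)
        sd = statistics.pstdev(values)
        if sd > 0 and abs(current[metric] - mean) > k * sd:
            flagged.append(metric)
    return flagged

history = {"logins": [4, 5, 6, 5, 4], "procs": [100, 102, 98, 101, 99]}
flagged = anomalies(history, {"logins": 25, "procs": 100})
```

An ontology would then structure what each flagged event means (which host, which service, likely causes) so that reasoning over the flags becomes possible.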
Adaa, Margareth Pancras. Ontology for host-based anomaly detection. Masteroppgave, University of Oslo, 2007
http://hdl.handle.net/10852/9749
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Adaa, Margareth Pancras&rft.title=Ontology for host-based anomaly detection&rft.inst=University of Oslo&rft.date=2007&rft.degree=Masteroppgave
URN:NBN:no-15271
63409
07101621x
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9749/1/adaa.pdf
Ontology for host-based anomaly detection
oai:www.duo.uio.no:10852/8711
2017-12-07T13:05:06Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2010
The thesis explores how a web-based calendar and reminder service called BirthdayHero could help users improve how well they remember birthdays, anniversaries and other important events. A primary goal is to use the design process to better understand the relationship between internal and external information management. Internal information management is what we do when we use our memory to store and manage information. External information management is the use of physical or digital artifacts, like calendars or journals.
To understand user needs, interviews have been conducted with potential users. The interviews uncovered that people use many different tools to manage birthdays and events. Not all of their needs are met with the tools they are currently using. Almost all participants would have liked to get SMS text message reminders for important events. The strengths and weaknesses of using e-mail and SMS messages as reminders are discussed. The thesis argues that SMS reminders are the best choice for reminding users of events, because they are immediately received and are likely to be read.
The thesis is structured around four central research questions.
The first research question concerns the development of a usable web-based calendar and reminder service. Working prototypes of the calendar interface have been designed and tested for usability. The tests found that the prototypes are to some extent usable and satisfy user needs, but that there is still room for improvement.
Second, the differences between internal and external information management are explored. This is done to outline the strengths and weaknesses of relying on our memory to remember important dates, versus relying on calendars, organizers or other physical artifacts.
Third, the contrasts between appliance and general purpose services are discussed. The thesis argues that BirthdayHero could fulfill an unmet user need as a backup service. It can ensure that users can’t forget important events, while keeping the complexity of the service to a minimum. This entails that the service will be an appliance service, focusing on doing one thing, and doing it well.
Finally, the thesis presents a method of improving how well users memorize dates, based on a learning technique called spaced repetition. Through a prototype design, it explores how the service could improve how well users remember events even without reminders.
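The spaced-repetition idea in the final paragraph can be sketched as an expanding review schedule, where each successful recall multiplies the gap until the next prompt. The interval and growth factor below are illustrative, not the thesis's actual algorithm:

```python
# Sketch of a spaced-repetition schedule: the gap between reviews grows
# by a fixed factor after each successful recall. The starting interval
# and factor are illustrative parameters, not taken from the thesis.

def review_schedule(first_interval_days=1, factor=2.5, reviews=5):
    """Days after learning at which each review falls, assuming every
    recall succeeds (failures would normally reset the gap)."""
    day, gap, schedule = 0, first_interval_days, []
    for _ in range(reviews):
        day += gap
        schedule.append(day)
        gap *= factor
    return schedule

schedule = review_schedule()  # gaps expand: reviews cluster early, then spread out
```

Prompting a user for a date on such a schedule is how the service could move an event from external storage (the calendar) into internal memory.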
Tronstad, Jørgen Aares. Creating a usable web-based calendar and reminder service. Masteroppgave, University of Oslo, 2010
http://hdl.handle.net/10852/8711
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Tronstad, Jørgen Aares&rft.title=Creating a usable web-based calendar and reminder service&rft.inst=University of Oslo&rft.date=2010&rft.degree=Masteroppgave
URN:NBN:no-25669
102017
102189978
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8711/4/Tronstad.pdf
Creating a usable web-based calendar and reminder service : an investigation of internal and external information management
oai:www.duo.uio.no:10852/8727
2017-12-07T13:05:06Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2010
Mobile ad hoc networks (MANETs) often suffer from disruptions
and partitioning. Techniques to tackle these challenges, such as
caching and replication, might result in data being widely
distributed in the network. Enabling media streaming in such
networks raises the requirement for an improved signaling system.
Therefore, in this thesis we have designed and implemented a
solution for signaling of media streaming in MANETs.
This solution is able to create metadata that is kept and sent
together with audio and video (AV) data, so specific data can be
found and retrieved if disruption and partitioning occur. This
also provides the ability to gather metadata and create overviews
of where AV data is distributed in the network. In addition, it
provides detailed control over the streaming process, so users can
choose whether missing data is important enough to be retrieved
or not.
Since no existing media players currently support this
functionality, we have also provided the design of an
experimental media player. This media player is tailored to support
the specifics of our signaling system, and makes it easier to
illustrate the benefits of the functionality. Through evaluation we have
verified that the user is provided with this fine-grained control of
different streaming sessions, in addition to being informed with as
many details of the process as possible.
Dybsjord, Lars Olav. DTSS - Signaling for Media Streaming and Delay Tolerant Network. Masteroppgave, University of Oslo, 2010
http://hdl.handle.net/10852/8727
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Dybsjord, Lars Olav&rft.title=DTSS - Signaling for Media Streaming and Delay Tolerant Network&rft.inst=University of Oslo&rft.date=2010&rft.degree=Masteroppgave
URN:NBN:no-25815
102841
102245592
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8727/1/Dybsjord.pdf
DTSS - Signaling for Media Streaming and Delay Tolerant Network
oai:www.duo.uio.no:10852/9738
2017-12-07T13:05:06Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2006
At the beginning of the 1970s, large parts of the research field of artificial intelligence were characterized by narrow problems in simplified worlds. The idea was that if a good solution to a problem existed in a simplified world, it could be carried over to our complex reality. Important research at the Massachusetts Institute of Technology was devoted to problems in an imagined world consisting of blocks of different shapes. Terry Winograd's SHRDLU program is the best known from this era.
In this master's thesis I describe the development of a blocks world, based on SHRDLU's world, with a robot whose knowledge representation lets it reason its way to solutions of the various problems it is given.
The blocks world consists of a server application developed in CLISP that simulates the world. The world is visualized using a Java or Flash client. Standardized interfaces in the blocks world make it easy to test different robots.
As its knowledge representation and planning system, the robot in the blocks world uses a language developed for this thesis, Closlog. It uses an object-oriented knowledge representation, made possible by CLISP's object-oriented system.
In the thesis I show how a simple problem, such as finding a possible path between the blocks in the blocks world, can yield a very large search space. Different search strategies in Closlog show how the robot is able to solve simple tasks, but also more complex ones such as the 8-puzzle.
Finally, I discuss future uses of the blocks world. The robot can be extended with a module that understands natural language. I also show how Closlog's representation of knowledge enables the robot to learn from previous experience.
Attached to the thesis is a CD with documentation, source code and a runnable version of the blocks world.
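The 8-puzzle mentioned above can be solved by uninformed search over board states; the breadth-first sketch below illustrates the search-space point without reproducing Closlog's strategies:

```python
# Breadth-first search for the 8-puzzle: boards are 9-tuples read row by
# row, with 0 as the blank. A sketch of uninformed state-space search,
# not Closlog's actual search strategies.
from collections import deque

GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)

def neighbours(state):
    """States reachable by sliding one adjacent tile into the blank."""
    i = state.index(0)
    r, c = divmod(i, 3)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 3 and 0 <= nc < 3:
            j = nr * 3 + nc
            s = list(state)
            s[i], s[j] = s[j], s[i]
            yield tuple(s)

def solve(start):
    """Return the minimum number of moves from `start` to GOAL."""
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        state, depth = frontier.popleft()
        if state == GOAL:
            return depth
        for nxt in neighbours(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return None  # start lies in the unsolvable half of the state space

moves = solve((1, 2, 3, 4, 5, 6, 0, 7, 8))  # blank two slides from home
```

Even this small puzzle has 9!/2 = 181,440 reachable states, which is why the thesis compares different search strategies rather than relying on blind enumeration alone.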
Jøssang, Tom Andreas. Klosseverden. Masteroppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9738
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Jøssang, Tom Andreas&rft.title=Klosseverden&rft.inst=University of Oslo&rft.date=2006&rft.degree=Masteroppgave
URN:NBN:no-15079
62908
071015779
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9738/1/Jossang.pdf
Klosseverden : - problemløsning ved bruk av Closlog
oai:www.duo.uio.no:10852/9936
2013-03-12T07:57:38Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2002
The thesis examines event-based communication between components in a distributed system, and is part of the Distribuert Multimedia Jornalering (DMJ) project at the Simula Centre at Fornebu. It considers three message brokers: Message Bus (Mbus), Scalable Internet Event Notification Architecture (SIENA), and the Notification Service of the Common Object Request Broker Architecture (CORBA). The three brokers are introduced and assessed against the functional and non-functional requirements the DMJ project places on its communication; the aim is to give the project the best possible basis for choosing a message broker. The thesis identifies strengths and weaknesses of each broker, presents measurements made for them, and comments on how the project's communication would be affected by using each of the three. The conclusion on the choice of message broker is not clear-cut.
Tyvand, Kjetil. En vurdering av meldingsformidlere til bruk i DMJ-prosjektet. Hovedoppgave, University of Oslo, 2002
http://hdl.handle.net/10852/9936
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Tyvand, Kjetil&rft.title=En vurdering av meldingsformidlere til bruk i DMJ-prosjektet&rft.inst=University of Oslo&rft.date=2002&rft.degree=Hovedoppgave
URN:NBN:no-5284
8025
022768424
En vurdering av meldingsformidlere til bruk i DMJ-prosjektet
oai:www.duo.uio.no:10852/8848
2015-02-13T05:03:48Z
2010
The paper addresses possibilities of extracting information from music-related actions, in the particular case of what we call sound-tracings. These tracings are recordings from a graphics tablet of subjects' drawings associated with a set of short sounds. Although the subjects' associations to sounds are very subjective, and thus the resulting tracings are very different, an attempt is made at extracting some global features which can be used for comparison between tracings. These features are then analyzed and classified with an SVM classifier.
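The kind of global feature extraction described here can be illustrated with a small Python sketch. The feature set below (path length, bounding-box aspect ratio, vertical drift) is hypothetical, chosen for illustration rather than taken from the paper, and the subsequent SVM classification would be done with a library such as libsvm:

```python
import math

def tracing_features(points):
    """Global descriptors for one sound-tracing: a list of (x, y) tablet samples.

    Illustrative features only: total path length, bounding-box aspect
    ratio, and net vertical movement from first to last sample.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    # Sum of distances between consecutive samples.
    length = sum(math.dist(points[i], points[i + 1])
                 for i in range(len(points) - 1))
    width = max(xs) - min(xs) or 1.0   # guard against degenerate tracings
    height = max(ys) - min(ys) or 1.0
    return {
        "length": length,
        "aspect": width / height,
        "vertical_drift": points[-1][1] - points[0][1],
    }
```

Reducing each tracing to a fixed-length feature vector is what makes very different drawings comparable at all, and is a prerequisite for feeding them to a classifier.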
http://hdl.handle.net/10852/8848
40243
63
66
URN:NBN:no-28128
115539
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8848/1/Glette_2010.pdf
Extracting action-sound features from a sound-tracing study
oai:www.duo.uio.no:10852/9998
2014-12-26T05:03:35Z
2003
This thesis is about managers who are mobile in their daily work. The research question has been: How does a mobile manager fulfil his or her leadership roles, and how can ICT support the manager in doing so? To shed light on the question, ethnographic studies were carried out among foremen on a construction project. These foremen lead the many workers erecting a building, and they are constantly on the move around the construction site. The studies show that the foremen became mobile in order to stay informed. A large part of their decision basis was knowledge of what was happening out on the site; they had to stay up to date on their surroundings in order to exercise their leadership roles. The thesis calls this an awareness component of leadership: through this component, managers keep themselves informed. The foremen stayed conscious of their surroundings by walking around and communicating orally, face to face, with the various actors on the construction project. Studies show that managers prefer such verbal face-to-face communication, much of which consists of pure information transfer. The thesis proposes mobile awareness technology as ICT support for mobile managers. This technology is meant to help mobile people initiate informal communication with others frequently and efficiently. Informal communication is considered a very important factor for achieving awareness in the workplace. Awareness is knowledge created through interaction with others: being conscious of one's surroundings and of what is going on around one. This creates an understanding of other people's activities, which in turn forms a context for one's own activity. The technology could help mobile managers become more aware of the work going on around them, which constitutes an important part of their decision basis. The thesis presents a framework for the design of mobile awareness systems that may be useful when considering introducing such a system in an organisation.
The framework presented is recommended as an aid in drafting a requirements specification and deriving design implications for mobile awareness systems.
Talgøe, Kristin. Mobil Ledelse. Hovedoppgave, University of Oslo, 2003
http://hdl.handle.net/10852/9998
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Talgøe, Kristin&rft.title=Mobil Ledelse&rft.inst=University of Oslo&rft.date=2003&rft.degree=Hovedoppgave
URN:NBN:no-5185
8724
030454093
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9998/1/talgoe.pdf
Mobil Ledelse : Mobil Awareness Teknologi som støtte for mobile ledere
oai:www.duo.uio.no:10852/9335
2014-12-26T05:03:36Z
2002
In this cand.scient. thesis we propose a strategy for testing the validity of decompositions of contract-oriented specifications. The strategy is based on Abadi and Lamport's Composition Theorem for the Temporal Logic of Actions, and on test case generation from executable specifications.
A composition rule, inspired by the Composition Theorem, is formulated in a semantics based on timed streams. A subset of the Specification and Description Language (SDL) is defined, and this SDL subset is formalized in the semantics.
A simplification of the testing strategy was realized in an experimental prototype tool for testing contract decompositions in SDL. In addition, another prototype tool based on a conventional strategy was built as a reference tool.
Testing of the two tools showed that both validated valid contract decompositions and falsified invalid ones. Testing also showed that the tool based on the composition rule was, in some interesting situations, considerably more efficient than the tool based on the conventional strategy.
Lund, Mass Soldal. Validation of Contract Decomposition by Testing. Hovedoppgave, University of Oslo, 2002
http://hdl.handle.net/10852/9335
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Lund, Mass Soldal&rft.title=Validation of Contract Decomposition by Testing&rft.inst=University of Oslo&rft.date=2002&rft.degree=Hovedoppgave
URN:NBN:no-3283
2942
02195156x
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9335/1/lund.pdf
Validation of Contract Decomposition by Testing
oai:www.duo.uio.no:10852/9490
2017-12-07T13:05:06Z
2006
Video streams generally do not adapt to changes in the network environment, causing image quality to suffer. Graceful video scaling requires fine-granular adaptation, and scalable video codecs like SPEG and MPEG-4 FGS provide this ability. Little research has been done on how continuous quality changes affect perceived video quality, or on how existing metrics can be used to measure this. No objective metric addresses this particular problem.
Some objective metrics emulate the human visual system, and the project compared these with subjective results. The comparison indicated that objective tests could be used in place of subjective tests, which are more resource-intensive; more testing is needed before firm conclusions can be drawn.
Results from the quality evaluation tests could be used to generate a utility function for measuring perceived quality in scalable video, based on exponential functions describing each test parameter. Due to time constraints, only an abstract approach to this problem was proposed.
Some shortcomings were observed. The test parameters were too few, and their values were not spaced far enough apart. Testing with the chosen subjective evaluation method, DSCQS, produced a data set too small for serious use. The test should have had access to more participants, and the viewers should have seen fewer clips but with more parameter variations.
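Objective metrics range from simple signal-fidelity measures to models of the human visual system. As a baseline example of the former, PSNR can be computed as follows — a generic sketch, not one of the HVS-emulating metrics the thesis evaluates:

```python
import math

def psnr(reference, degraded, max_value=255):
    """Peak signal-to-noise ratio between two equal-length pixel sequences.

    Higher is better; identical signals give infinity. `max_value` is the
    peak pixel value (255 for 8-bit video).
    """
    mse = sum((a - b) ** 2 for a, b in zip(reference, degraded)) / len(reference)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(max_value ** 2 / mse)
```

PSNR is cheap and well defined, but, unlike subjective tests or HVS models, it ignores perceptual effects such as how viewers react to continuous quality switches — which is precisely the gap the thesis points at.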
Ruud, Bjørn Olav. Video Quality Measurement of Scalable Video Streams. Masteroppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9490
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Ruud, Bjørn Olav&rft.title=Video Quality Measurement of Scalable Video Streams&rft.inst=University of Oslo&rft.date=2006&rft.degree=Masteroppgave
URN:NBN:no-12569
42402
061227706
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9490/1/Ruud.pdf
Video Quality Measurement of Scalable Video Streams
oai:www.duo.uio.no:10852/8879
2017-12-07T13:05:07Z
2011
In recent years there has been rapid development in sequencing technologies. These new technologies produce data on the order of several gigabase-pairs per day, in the form of short and numerous sequences. Such short sequences are often used for resequencing: aligning DNA sequenced from an organism with a known genome to a reference genome of that organism. Doing this alignment with traditional alignment tools like BLAST has proved too time-consuming, and because of this several new short-sequence aligners have been developed. These new tools are much faster than the traditional ones. I wanted to study whether a GPU could be used to create a faster tool for this, because a GPU is well suited to speeding up algorithms through massive parallelism.
I have developed GPUalign, a short-read alignment tool. It uses a simple hash-based index algorithm that aligns the reads with massive parallelism on the GPU. Tests evaluated the speed and accuracy of GPUalign and compared it to the state-of-the-art tool BWA. GPUalign performed well and showed great potential for the use of GPUs in short-sequence alignment. At the same time, GPUalign still has much room for further improvements in speed and accuracy, and it should scale well with future improvements in GPU technology.
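The hash-based seed-and-verify idea behind such aligners can be sketched in a few lines of Python. This is a toy sequential CPU version for illustration only — not GPUalign's GPU implementation — and the function names are my own:

```python
from collections import defaultdict

def build_index(reference, k=4):
    """Hash every k-mer of the reference genome to its list of positions."""
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    return index

def align_read(read, reference, index, k=4):
    """Seed with the read's first k-mer, then verify each candidate
    position by direct comparison; returns the best (position, mismatches)
    pair, or None if no seed hit exists."""
    best = None
    for pos in index.get(read[:k], []):
        window = reference[pos:pos + len(read)]
        if len(window) < len(read):
            continue  # seed too close to the end of the reference
        mismatches = sum(a != b for a, b in zip(read, window))
        if best is None or mismatches < best[1]:
            best = (pos, mismatches)
    return best
```

The verification loop over candidate positions is embarrassingly parallel, which is what makes this style of algorithm a natural fit for a GPU: each read (or each candidate position) can be checked by its own thread.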
Ruud, Bjørnar Andreas. Parallel alignment of short sequence reads on graphics processors. Masteroppgave, University of Oslo, 2011
http://hdl.handle.net/10852/8879
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Ruud, Bjørnar Andreas&rft.title=Parallel alignment of short sequence reads on graphics processors&rft.inst=University of Oslo&rft.date=2011&rft.degree=Masteroppgave
URN:NBN:no-29193
118595
114544042
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8879/3/Ruud.pdf
Parallel alignment of short sequence reads on graphics processors
oai:www.duo.uio.no:10852/9144
2014-12-26T05:03:46Z
2004
Abstract
Because of strong market competition between services in today's multi-provider environment, means to regulate quality of service (QoS) are required to safeguard the rights of providers, users and other actors when purchasing or selling a service. A multitude of telecom services is already available to users, and with the implementation of next generation networks (NGN), or similar future network propositions, the number of services and providers is expected to increase rapidly. More than one actor may be involved in delivering a service, and the need to regulate QoS between the actors involved in delivering a service is therefore even more pressing in NGN.
Creating a model with effective traffic and network handling methods for packet based networks, using tools such as MPLS, DiffServ, routing mechanisms and SLAs is one of the future goals. This thesis concentrates on methods of how such a model can be realized, using different tools.
One proposal of regulating QoS is to implement service level agreements (SLA) as standard, regular agreements being used whenever an exchange of service resources is made. An SLA should include service description, user’s and provider’s rights in terms of delivery, faults, service degradation, monitoring, pricing etc. Other quality of service terms which are service specific are often included in attachments or as individual QoS agreements. SLAs can also be used by customers to compare similar services, that way improving the competing environment.
Standardizing SLAs has been a main focus for many standardization organizations; how can the SLAs guarantee quality of service, as opposed to free market competition and legal regulation. A naïve perspective may be to say that the laws in Norway and free market competition are sufficient to regulate the multi-provider environment, but there are reasons why it is not so. Service Level Agreements are used to ensure that all relationships in an actor network environment, be it service providers, network operators, customers etc, operate in the “correct” way; ensuring quality of service, traffic engineering, and economic and legal issues. This is why the content of SLAs is one of the “hottest” topics among big actors in the telecommunication sector.
Sørsdal, Nina. Service Level Agreements between actors. Hovedoppgave, University of Oslo, 2004
http://hdl.handle.net/10852/9144
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Sørsdal, Nina&rft.title=Service Level Agreements between actors&rft.inst=University of Oslo&rft.date=2004&rft.degree=Hovedoppgave
URN:NBN:no-9860
20035
041977726
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9144/1/hele.pdf
Service Level Agreements between actors
oai:www.duo.uio.no:10852/9557
2017-12-07T13:05:07Z
2006
This thesis presents a system for adapting the bandwidth of a Delta-Sigma DAC based on the spectral contents of the input signal. By dynamically adjusting the bandwidth of the converter based on the transitory requirements of the input signal, quantisation noise can be suppressed further in frequency bands that are idle. It is shown how a compression scheme can be exploited to efficiently obtain the spectral information needed.
A simulation model of the system is used to quantify the performance gain experimentally. The results obtained from the simulations substantiate the claim of a performance increase that outweighs the complexity incurred by the approach for certain classes of input signals.
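The spectral-detection step can be illustrated with a naive DFT in Python. This is a conceptual sketch only: the thesis obtains the spectral information from a compression scheme rather than an explicit transform, and the threshold rule below is hypothetical:

```python
import cmath

def band_energy(samples, k):
    """Energy in DFT bin k of a real-valued sample block (naive O(n) DFT)."""
    n = len(samples)
    coeff = sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(samples))
    return abs(coeff) ** 2 / n

def idle_bands(samples, threshold=1e-6):
    """Bins whose energy falls below the threshold: candidates where an
    adaptive converter could push extra quantisation noise (illustrative
    decision rule, not the thesis's)."""
    n = len(samples)
    return [k for k in range(n // 2) if band_energy(samples, k) < threshold]
```

Once the idle bands of the input are known, the noise-shaping loop can be reconfigured so that quantisation noise is steered into those bands, which is the core of the bandwidth-adaptation idea.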
Michaelsen, Jørgen Andreas. Suppression of Delta-Sigma DAC Quantisation Noise by Bandwidth Adaptation. Hovedoppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9557
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Michaelsen, Jørgen Andreas&rft.title=Suppression of Delta-Sigma DAC Quantisation Noise by Bandwidth Adaptation&rft.inst=University of Oslo&rft.date=2006&rft.degree=Hovedoppgave
URN:NBN:no-13424
46168
061870897
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9557/1/Michaelsen.pdf
Suppression of Delta-Sigma DAC Quantisation Noise by Bandwidth Adaptation
oai:www.duo.uio.no:10852/9999
2017-12-07T13:05:07Z
2008
The thesis examines the use of information systems in emergency response situations (ambulance services). Through partially participatory observation (roughly 270 hours), it attempts to define the particular requirements that system development must meet for the special user group that ambulance personnel constitute. From an initial focus on usability aspects and CHI, the thesis moves towards the conclusion that successful system development for the ambulance sector should rest on a better understanding of the context of use, and of how that context shapes ambulance personnel's relationship to information.
Hansen, Geir Ole. Utfordringer for ambulansepersonell og systembruk. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/9999
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Hansen, Geir Ole&rft.title=Utfordringer for ambulansepersonell og systembruk&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-21044
87364
091862116
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9999/1/Hansen.pdf
Utfordringer for ambulansepersonell og systembruk
oai:www.duo.uio.no:10852/8984
2017-12-07T13:05:07Z
2011
Adaptive HTTP streaming is frequently used for both live and on demand video delivery over the Internet. Adaptiveness is often achieved by encoding the video stream in multiple qualities (and thus bitrates), and then transparently switching between the qualities according to the bandwidth fluctuations and the amount of resources available for decoding the video content on the end device. For this kind of video delivery over the Internet, H.264 is currently the most used codec, but VP8 is an emerging open source codec expected to compete with H.264 in the streaming scenario. The challenge is that, when encoding video for adaptive video streaming, both VP8 and H.264 run once for each quality layer, i.e., consuming both time and resources, especially important in a live video delivery scenario.
In this thesis, we address the resource consumption issues by proposing a method for reusing redundant steps in a video encoder, emitting multiple outputs with varying bitrates and qualities. It shares and reuses the computational heavy analysis step, notably macro-block mode decision, intra prediction and inter prediction between the instances, and outputs video in several rates. The method has been implemented in the VP8 reference encoder, and experimental results show that we can encode the different quality layers at the same rates and qualities compared to the VP8 reference encoder, while reducing the encoding time significantly.
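The structural idea — run the expensive analysis once per frame and reuse its decisions for every quality layer — can be sketched as follows. This is an abstract skeleton, not the VP8 encoder code; `analyze` and `encode` here stand in for the real mode-decision/prediction and per-rate coding stages:

```python
def multi_rate_encode(frames, bitrates, analyze, encode):
    """Encode one input stream into several bitrate layers while sharing
    the costly per-frame analysis step across all layers.

    `analyze(frame)` runs once per frame and returns its decisions;
    `encode(frame, decisions, rate)` is the comparatively cheap per-layer
    step that reuses those decisions.
    """
    outputs = {rate: [] for rate in bitrates}
    for frame in frames:
        decisions = analyze(frame)  # expensive: run exactly once per frame
        for rate in bitrates:
            outputs[rate].append(encode(frame, decisions, rate))
    return outputs
```

With N quality layers, the analysis cost stays constant instead of growing N-fold, which is where the reported encoding-time reduction comes from; only the per-rate coding work scales with the number of layers.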
Finstad, Dag Haavi. Multi-Rate VP8 Video Encoding. Masteroppgave, University of Oslo, 2011
http://hdl.handle.net/10852/8984
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Finstad, Dag Haavi&rft.title=Multi-Rate VP8 Video Encoding&rft.inst=University of Oslo&rft.date=2011&rft.degree=Masteroppgave
URN:NBN:no-29844
135255
120377225
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8984/1/Finstad.pdf
Multi-Rate VP8 Video Encoding
oai:www.duo.uio.no:10852/9956
2017-12-07T13:05:08Z
2008
During the past ten years digital technology has entered the world of mammography. It has not taken over completely, however, because there is still disagreement over whether digital technology is as good as analog at detecting breast cancer. The radiation dose to the patient is another issue, because the digital technology uses larger amounts of radiation.
To contribute to the improvement of the digital mammography technology, research has been done on the pixels used in the x-ray sensors.
Different pixel architectures have been implemented in a standard CMOS technology and compared with respect to sensitivity to low light exposure. This work was done in two phases. First, some test pixels were implemented alongside an image sensor belonging to an EU-funded project (I-ImaS). Based on the results from this testing and on new calculations, a new set of test pixels was implemented for further testing.
The first set of test pixels, designed by research scientists at SINTEF, were implemented as 3-transistor pixels (3T). The size of the photo diode is varied, as is the size of the source follower transistor (SF) in the pixel. The pixels were compared with respect to sensitivity. In addition some pixels are covered with metal while their neighbouring pixels are uncovered. This way influence between pixels is measured.
In the second set of test pixels there are 3-transistor pixels just like in the first set. In addition 4-transistor pixels (4T) and photo gate pixels (PG) are implemented on the same chip. The same variations in photo diode and SF size have been made to the 3T pixels to further investigate the results from the first testing. In addition another variation has been made to some of the 3T and 4T pixels: The n-well photo diode has been made with one or more holes, in order to decrease the area and thus the capacitance of the diode without decreasing the reach (radius) and charge collection of the diode.
Løkken, Kristin Hammarstrøm. Active pixels for x-ray applications, implemented in a standard CMOS technology. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/9956
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Løkken, Kristin Hammarstrøm&rft.title=Active pixels for x-ray applications, implemented in a standard CMOS technology&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-19844
82031
091962374
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9956/1/Lokken.pdf
Active pixels for x-ray applications, implemented in a standard CMOS technology
oai:www.duo.uio.no:10852/9435
2014-12-26T05:00:36Z
2006
The World Wide Web is drowning in content. Stagnant websites, dead hyperlinks, inconsistent web design and chaotic site maps are all symptoms of a polluted Web where valuable content is hard to find. Web content management (WCM) systems have become an increasingly popular answer to these problems. In fact, these systems are in such high demand that competing vendors seek to lock their users into proprietary solutions and standards. A reaction against this trend is the range of open source solutions appearing to relieve the web content pressure, as well as an emerging suite of open standards specifying how web content can be transported and stored. By developing WCM systems, both inside a commercial company and by participating in an open source project, we have uncovered the relations between web content management, open standards and open source software. The results include how certain requirements of WCM systems are influenced by open source environments and the use of open standards, as well as the implications such environments have for developers.
Nicolaisen, Thomas Ferris. The Use of Open Source and Open Standards in Web Content Management Systems. Masteroppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9435
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Nicolaisen, Thomas Ferris&rft.title=The Use of Open Source and Open Standards in Web Content Management Systems&rft.inst=University of Oslo&rft.date=2006&rft.degree=Masteroppgave
URN:NBN:no-12354
40125
060953969
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9435/1/thesis.pdf
The Use of Open Source and Open Standards in Web Content Management Systems
oai:www.duo.uio.no:10852/8942
2017-12-07T13:05:08Z
2011
Recently, IPv6 has become a worldwide topic because of IPv4 address exhaustion, not least since the Internet Assigned Numbers Authority (IANA) allocated the final IPv4 address blocks in February 2011. IPv6 solves the addressing problem, yet its adoption rate remains low. Among the reasons are high adoption costs, an unproven return on IPv6 technology, and stopgaps such as Classless Inter-Domain Routing and Network Address Translation that delayed IPv4 exhaustion.
This thesis investigates the IPv6 protocol by conducting several measurements over long distances. One study gives an indication of the rate at which IPv6 is being adopted, by monitoring over seven thousand university web servers over a period of two weeks. Another study compares the TCP throughput of the WWW protocol on five university web servers over both IPv6 and IPv4 networks, also over a period of two weeks.
The results indicated that TCP throughput over the two Internet protocols was nearly identical. The adoption rate study was not conclusive on its own, but an increase was indicated; combined with a survey of previous work and observation of IPv6 monitoring websites on the IPv4 Internet, it suggests that IPv6 is most probably being adopted continuously.
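A minimal version of such an adoption survey can be written with Python's standard socket module: a host counts as IPv6-enabled if DNS returns an AAAA record for it. The helper names are my own, and the injectable resolver exists only to make the sketch testable offline:

```python
import socket

def has_ipv6(host, resolve=socket.getaddrinfo):
    """True if the host resolves to at least one IPv6 (AAAA) address."""
    try:
        infos = resolve(host, 80, socket.AF_INET6)
    except socket.gaierror:
        return False  # no AAAA record (or resolution failed)
    return len(infos) > 0

def adoption_rate(hosts, resolve=socket.getaddrinfo):
    """Fraction of the surveyed hosts that publish an IPv6 address."""
    reachable = sum(has_ipv6(h, resolve) for h in hosts)
    return reachable / len(hosts) if hosts else 0.0
```

Note that a DNS check like this measures only whether a site *advertises* IPv6; the thesis's throughput comparison additionally required actual data transfers over both protocols.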
Yildirim, Ali Emre. Measuring IPv6 adoption rate and performance in the Internet. Masteroppgave, University of Oslo, 2011
http://hdl.handle.net/10852/8942
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Yildirim, Ali Emre&rft.title=Measuring IPv6 adoption rate and performance in the Internet&rft.inst=University of Oslo&rft.date=2011&rft.degree=Masteroppgave
URN:NBN:no-29850
129715
120426900
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8942/2/Yildirim.pdf
Measuring IPv6 adoption rate and performance in the Internet
oai:www.duo.uio.no:10852/10010
2017-12-07T13:05:08Z
2008
Software applications that depend on frequent user interaction need user interfaces with a high degree of usability. However, creating high-usability user interfaces is a challenging task for software developers, and the results may depend heavily on individual developers' knowledge of usability. User-Centered Design suggests another approach, possibly giving more predictable results: involving users in the development and evaluation of the user interfaces. To enable such user involvement, we developed a web application called the Design Feedback Tool. It presents multiple design alternatives to users and allows them to comment on and rate them. A tailored development method was used to create the specification, design and implementation of the application. Furthermore, a paper prototype of its user interface was evaluated using the Heuristic Evaluation method, and a security risk analysis was conducted. The documentation and lessons learned may be useful for software developers and researchers creating similar applications, as well as for research into the use of software development methods. Furthermore, the results of the prototype evaluation and the security risk analysis may be of interest to researchers in those areas.
Warholm, Lars. Software for user evaluation of user interface designs. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/10010
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Warholm, Lars&rft.title=Software for user evaluation of user interface designs&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-21050
88211
090021398
Fulltext https://www.duo.uio.no/bitstream/handle/10852/10010/2/Warholm.pdf
Software for user evaluation of user interface designs
oai:www.duo.uio.no:10852/8943
2022-01-20T23:37:09Z
2011
This Master's thesis investigates the predictability of game server resource data by developing and implementing a predictive algorithm. Thorough testing of the algorithm has been performed, and the results show that game server resource data is predictable to some extent. The findings open up the possibility of predictively allocating sufficient resources to game servers.
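The thesis does not specify its algorithm in this abstract; as an illustrative stand-in, a one-step-ahead forecast obtained by fitting a least-squares line to a sliding window of recent resource samples could look like this:

```python
def predict_next(samples, window=5):
    """Forecast the next value of a resource time series (e.g. CPU load)
    by least-squares linear extrapolation over the last `window` samples.

    Illustrative only: a stand-in for the thesis's predictive algorithm.
    """
    recent = samples[-window:]
    n = len(recent)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(recent) / n
    var_x = sum((x - mean_x) ** 2 for x in xs)
    if var_x == 0:
        return mean_y  # a single sample: no trend to extrapolate
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(xs, recent)) / var_x
    # Extrapolate the fitted line one step past the window.
    return mean_y + slope * (n - mean_x)
```

Any such predictor trades responsiveness against noise sensitivity via the window size — exactly the kind of trade-off an "investigative approach" to server-resource predictability has to quantify.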
Tyvand, J. E. On the predictability of server resources in online games, an investigative approach. Masteroppgave, University of Oslo, 2011
http://hdl.handle.net/10852/8943
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Tyvand, J. E.&rft.title=On the predictability of server resources in online games, an investigative approach&rft.inst=University of Oslo&rft.date=2011&rft.degree=Masteroppgave
URN:NBN:no-29851
129716
120426935
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8943/1/Tyvand.pdf
On the predictability of server resources in online games, an investigative approach
oai:www.duo.uio.no:10852/9709
2013-03-12T07:57:57Z
2002
The computer has undergone great changes over the past 50 years: once perceived as a tool, it is now perceived by many also as a medium. In Norway, the investment in ICT (information and communication technology) in education is extensive, and pupils have gradually grown used to working with educational applications. With the computer's development and its different "roles", however, we will increasingly encounter educational applications that combine being presented with material and producing something oneself, where the interface gives direct access to the presented material within one's own production.
This thesis examines how such a combination can be supported in an application. To that end, I took part in a field trial in which pupils tested a learning application at four schools in Norway.
The finished results produced by the pupils were studied and related to how they came about during the work process. Theoretical studies treating the computer as a tool and as a medium, respectively, have helped explain the empirical findings.
The application I evaluated seemed to work well as a tool, and alternating between being presented with material and producing material oneself proceeded without major problems. Being able to look through one's own presentation throughout the process seemed to be an engaging factor for the users. The application also seemed to couple presentation and production too tightly with respect to access to material, because the pupils largely used this material without any form of reworking. This is a factor that should be considered when designing an application with integrated production and presentation.
Lien, Sigrun Vedø. Produksjon og presentasjon i multimedie-undervisningsverktøy. Hovedoppgave, University of Oslo, 2002
http://hdl.handle.net/10852/9709
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Lien, Sigrun Vedø&rft.title=Produksjon og presentasjon i multimedie-undervisningsverktøy&rft.inst=University of Oslo&rft.date=2002&rft.degree=Hovedoppgave
URN:NBN:no-5219
6082
022019979
Produksjon og presentasjon i multimedie-undervisningsverktøy
oai:www.duo.uio.no:10852/9012
2014-12-26T05:03:54Z
2003
Amanda, an architecture for adaptive, autonomous searching guided by user behavior. Bjørn Remseth, rmz@rmz.priv.no. Date: 30 Oct. 2003. Original title: Amanda: En arkitektur for adaptiv og autonom søking styrt av brukeradferd. This thesis presents a research problem regarding the design of a system that continually presents a user with advice relevant to the task the user is pursuing while interacting with a computer. An answer to the research problem is given through the specification and partial implementation of an agent-based architecture called Amanda. Amanda collects traces left by user behavior in multiple locations, transforms these into queries, and uses the query results to provide advice to the user. Amanda's architecture is applicable to many kinds of user interaction, but the implementation is restricted to a case where the user works on a task in a text editor and documents are read through a web browser. The technical design uses elements from the theory of Information Retrieval and of Adaptive Autonomous Agents, but combines them in a somewhat novel manner. Literature studies of a set of works in the fields of Just-In-Time Information Retrieval, search engines and message delivery are used both as sources of design elements for Amanda and as tests of the architecture, to see whether the selected works can be implemented in or integrated with Amanda; by and large they can. This indicates that the architecture is robust with regard to different types of searching, sources of user behavior, and user interfaces. The partial implementation demonstrates that the implemented parts of the architecture were sound and able to deliver the specified behavior. In the chapter describing future work, several avenues are explored, with regard both to scaling in volume and to the types of input/output presented to the system.
It is indicated that using the architecture to present information to mobile users through their cellular phones might be an interesting direction.
Remseth, Bjørn. Amanda - en arkitektur for adaptiv autonom søking styrt av brukeradferd. Hovedoppgave, University of Oslo, 2003
http://hdl.handle.net/10852/9012
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Remseth, Bjørn&rft.title=Amanda - en arkitektur for adaptiv autonom søking styrt av brukeradferd&rft.inst=University of Oslo&rft.date=2003&rft.degree=Hovedoppgave
URN:NBN:no-7114
14296
032005180
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9012/1/oppgave-2003-10-30.pdf
Amanda - en arkitektur for adaptiv autonom søking styrt av brukeradferd
oai:www.duo.uio.no:10852/9316
2013-03-12T07:57:58Z
2005
Applications using IP networks place ever stricter QoS requirements, while today's IP networks can only offer best-effort performance. At the same time, wireless networks are becoming more and more popular and are used to an ever greater extent. IEEE 802.11e is a standard for QoS enhancements to the WLAN MAC layer. This thesis investigates how the IETF Differentiated Services architecture can be used over 802.11e to provide end-to-end QoS between the communicating nodes. The thesis proposes architectures for using Differentiated Services over 802.11e, and also verifies, through simulations, how suitable the 802.11e standard is for Differentiated Services.
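One building block such an architecture needs is a mapping from DiffServ per-hop behaviours to the four 802.11e EDCA access categories. The sketch below shows one possible static mapping in Python; the exact DSCP-to-category assignments are an illustrative design choice, not something mandated by either standard.

```python
# EDCA access categories of 802.11e, from highest to lowest priority.
AC_VO, AC_VI, AC_BE, AC_BK = "AC_VO", "AC_VI", "AC_BE", "AC_BK"

# One possible (illustrative) mapping from DiffServ code points to
# 802.11e access categories at the edge of the WLAN.
DSCP_TO_AC = {
    46: AC_VO,  # EF (Expedited Forwarding): voice
    34: AC_VI,  # AF41: interactive video
    26: AC_VI,  # AF31: streaming video
    8:  AC_BK,  # CS1: background bulk traffic
    0:  AC_BE,  # default PHB: best effort
}

def access_category(dscp):
    # Unknown code points fall back to best effort.
    return DSCP_TO_AC.get(dscp, AC_BE)
```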
Selvig, Bjørn. Differentiated Services in 802.11e WLAN. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9316
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Selvig, Bjørn&rft.title=Differentiated Services in 802.11e WLAN&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-10964
29002
051178729
Differentiated Services in 802.11e WLAN
oai:www.duo.uio.no:10852/9622
2017-12-15T12:16:12Z
2007
The concept of structured documents has been well established since the 1980s. Current word processors, e.g. Microsoft Office and OpenOffice.org, are able to save documents in structured XML-based formats. However, even though both formats are written in XML, they differ in semantics. The topic of this thesis is to evaluate how content and structure are stored in the two formats, and whether it is possible to write a two-way converter program between them.
Johansen, Sverre. Comparing Semantics in Office Document Formats. Hovedoppgave, University of Oslo, 2007
http://hdl.handle.net/10852/9622
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Johansen, Sverre&rft.title=Comparing Semantics in Office Document Formats&rft.inst=University of Oslo&rft.date=2007&rft.degree=Hovedoppgave
URN:NBN:no-14788
52649
070201773
Comparing Semantics in Office Document Formats
oai:www.duo.uio.no:10852/9317
2014-12-26T05:03:26Z
2005
The thesis numerically investigates the stability and convergence properties of the Parareal algorithm when it is run on the unsteady Stokes equations. The Parareal algorithm offers a parallel-in-time scheme for solving time dependent differential equations. The strategy of the algorithm follows the principles of domain decomposition.
The unsteady (or time dependent) Stokes equations are a set of partial differential equations (PDEs) that describe creeping flow. They are an important simplification of the more complex Navier-Stokes equations central to fluid dynamics. The motivation for wanting to use Parareal with the unsteady Stokes equations is to obtain faster computations in the temporal domain.
The stability and convergence are tested by using variations of the theta-rule for discretizing the temporal domain. The results were compared to similar numerical tests of the algorithm used with the heat equation. Prior analyses show that the algorithm will display proper convergence and stability traits for this equation. For stiff systems of ODEs the Parareal analysis states that the algorithm is stable when the coarse propagator uses theta in the range [2/3,1]. The unsteady Stokes equations are parabolic PDEs, and when semi-discretized in space they become systems of stiff ODEs. We therefore believe that the Parareal algorithm will remain stable and convergent when run on this problem.
Our numerical results indicate that the Parareal algorithm is indeed stable for [2/3,1] when it is used to solve the unsteady Stokes equations, although some uncertainty about its convergence rate is observed at theta = 2/3.
The thesis also numerically investigates the common estimate of the current error in the algorithm, which is used to determine convergence. We have performed numerical tests indicating that the ratio of the true error to the approximate error is constant, which suggests that this is indeed a good estimate of the error at iteration k.
The thesis problem was solved using a combination of Python and the C++ library Diffpack, where all governing code is written in Python. Through its successful use in this project, the thesis implementation acts as a proof of concept that such a combination is indeed possible for solving problems like the unsteady Stokes equations.
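The Parareal iteration described above can be sketched on the scalar test problem y' = λy, with the theta-rule serving as both the coarse propagator G (one step per subinterval) and the fine propagator F (many substeps). This is an illustrative sketch under those assumptions, not the thesis's Python/Diffpack implementation; all parameter values are arbitrary.

```python
import numpy as np

def theta_step(y, lam, dt, theta):
    # One theta-rule step for y' = lam * y:
    # (1 - dt*theta*lam) y_new = (1 + dt*(1-theta)*lam) y_old
    return y * (1 + dt * (1 - theta) * lam) / (1 - dt * theta * lam)

def propagate(y, lam, dt_total, steps, theta):
    # Propagate y over an interval of length dt_total with `steps` substeps.
    dt = dt_total / steps
    for _ in range(steps):
        y = theta_step(y, lam, dt, theta)
    return y

def parareal(lam=-5.0, T=1.0, N=10, fine_steps=50, theta=1.0, iters=5):
    dT = T / N
    U = np.zeros(N + 1)
    U[0] = 1.0
    # Initial guess from the coarse propagator alone.
    for n in range(N):
        U[n + 1] = propagate(U[n], lam, dT, 1, theta)
    for _ in range(iters):
        # Fine and coarse propagation from the previous iterate (parallel in time).
        F = np.array([propagate(U[n], lam, dT, fine_steps, theta) for n in range(N)])
        G_old = np.array([propagate(U[n], lam, dT, 1, theta) for n in range(N)])
        U_new = np.zeros_like(U)
        U_new[0] = U[0]
        for n in range(N):
            # Parareal correction: coarse(new) + fine(old) - coarse(old).
            U_new[n + 1] = propagate(U_new[n], lam, dT, 1, theta) + F[n] - G_old[n]
        U = U_new
    return U
```

With `iters` equal to the number of subintervals N, the iteration reproduces the serial fine solution exactly, which is a convenient correctness check.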
Ligner, Erica Madeleine. Solving the Unsteady Stokes Equations using the Parareal Algorithm. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9317
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Ligner, Erica Madeleine&rft.title=Solving the Unsteady Stokes Equations using the Parareal Algorithm&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-10965
29004
051545780
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9317/1/report.pdf
Solving the Unsteady Stokes Equations using the Parareal Algorithm
oai:www.duo.uio.no:10852/9433
2013-03-12T07:57:59Z
2006
This thesis describes a project whose goal was to develop a programming contest system. The programming contest system was designed to be robust, easy to install and maintain, and easy to use.
A web-based system has been developed, implemented in Perl and C.
Open source modules and applications, such as PostgreSQL, have been used extensively.
The teams participating in a programming contest have an interface where they can submit solutions to the system. The system implements fully automatic judging, where the system runs and judges all submitted solutions. Each team has its own status page, which is updated by the system regularly to give feedback to the team.
The contest administrator has his/her own web interface to administer the contest system.
Johnsen, Merethe. Programvare for programmeringsmesterskap. Hovedoppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9433
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Johnsen, Merethe&rft.title=Programvare for programmeringsmesterskap&rft.inst=University of Oslo&rft.date=2006&rft.degree=Hovedoppgave
URN:NBN:no-12352
40123
060774509
Programvare for programmeringsmesterskap
oai:www.duo.uio.no:10852/9668
2017-12-15T14:12:26Z
2007
This thesis presents an analysis and implementation of accelerated power quality signal processing in a field programmable gate array (FPGA).
The purpose of this study is to show that a medium size FPGA has the necessary resources to perform realtime analysis.
The research is part of an ongoing development project in digital monitoring and autonomous correction of faults.
Current implementations are based on an external processor or digital signal processor (DSP) and an FPGA.
Several alternative system-on-chip (SoC) architectures are reviewed.
A combination of a softcore microprocessor and hardware accelerators which natively work on data in a polar vector based format is proposed.
Methods are discussed not only with performance in mind; rapid development time and solutions familiar to developers are requirements as well.
Implementations are simulated and verified in hardware with artificial stimuli.
The result is a design combining a general purpose softcore microprocessor and several hardware accelerators.
This yields realtime performance with a footprint to match a medium size FPGA.
It is also shown that some signal enhancing functions applied to correct for sensor and sampling peculiarities can be performed very efficiently in software when working with polar vectors.
Further, it is shown that transferring data more efficiently between a hardware accelerator and a microprocessor increases throughput by up to 55%.
A method producing accurate results over a larger frequency range is introduced.
The method adapts the sample interval to the frequency of the signal by measuring vector velocity.
This is efficient and offers self-calibration.
Compared to a previously developed implementation it requires significantly less resources.
The embedded software developed takes advantage of dedicated hardware and provides an efficient and flexible environment.
This allows engineers not familiar with hardware details to take advantage of the platform.
Wold, Alexander. Accelerating power quality signal analysis in a field programmable gate array. Hovedoppgave, University of Oslo, 2007
http://hdl.handle.net/10852/9668
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Wold, Alexander&rft.title=Accelerating power quality signal analysis in a field programmable gate array&rft.inst=University of Oslo&rft.date=2007&rft.degree=Hovedoppgave
URN:NBN:no-14923
58609
070854181
Accelerating power quality signal analysis in a field programmable gate array
oai:www.duo.uio.no:10852/9296
2013-03-12T07:58:00Z
2005
This thesis is based on my participation in a globally distributed development process of an Open Source Health Information System (DHIS-2). The DHIS-2 project is a collaboration between Norway, Vietnam, India and South Africa, all members of the larger Health Information Systems Program (HISP) network. The DHIS-2 project aims to develop a sustainable system that can be locally adapted to the individual HISP countries as well as scale throughout the larger HISP network.
My task in this bigger setting was to participate in the development of one of the system's modules, a report designer. I worked together with Vietnamese students on this task, both remotely from Norway and through performing action research during a case study in Vietnam.
Based on my research findings I look into the extensive use of OSS in the DHIS-2 project, both as tools used to aid the development process and as software integrated into the DHIS-2 system. Furthermore, I discuss the current DHIS-2 approach against HISP's strategies and philosophies in relation to IS development and implementation. Finally, I investigate in what areas the DHIS-2 project affects developing countries in general, and Vietnam in particular.
Mangset, Lars. DHIS-2 - A Globally Distributed Development Process. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9296
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Mangset, Lars&rft.title=DHIS-2 - A Globally Distributed Development Process&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-10640
28248
051067277
DHIS-2 - A Globally Distributed Development Process : Performing Action Research In Vietnam
oai:www.duo.uio.no:10852/9434
2017-12-07T13:05:09Z
2006
The heavy load and rich variety of data on the Internet has resulted in the need to understand the characteristics of the traffic in order to better plan, develop, and implement new network devices, applications, and protocols. To obtain such knowledge, network monitoring is becoming more and more important. However, tools available for network monitoring are restricted to either offline analysis in DBMSs or online analysis through hard-coded continuous queries. Many streaming applications would benefit from a system where network monitoring queries can effectively be inserted, deleted, modified, and processed online in a continuous and real-time manner. The Data Stream Management System (DSMS) is a promising technology with respect to the needs of network monitoring, because it is designed to meet these requirements generated by many streaming applications. In the present thesis, an experimental analysis of STREAM as a network monitoring tool is performed. STREAM is a general-purpose DSMS and its continuous query language is known as CQL (Continuous Query Language). We investigate whether the current implementation of CQL operators makes it possible to express a wide-ranging set of network monitoring queries. Furthermore, STREAM's performance is measured through several experiments processed online over real network traffic.
Results reveal that STREAM can handle network loads up to 30 Mb/s for simple queries, and up to approximately 3 Mb/s for complex queries. When queries are executed concurrently, STREAM can handle network loads up to approximately 2.5 Mb/s, strongly depending on the complexity and number of queries.
STREAM provides a sizeable set of operators that makes it possible to express many types of queries. However, network monitoring queries are restricted by the lack of specific network data types and operators. Consequently, these queries must be expressed in cumbersome ways. STREAM manages to process network monitoring queries online in a continuous manner, but only at a very limited network load. Thus, the applicability of STREAM as a network monitoring tool is restricted.
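In CQL such monitoring tasks are written as declarative continuous queries over sliding windows, e.g. a per-source byte count over the last seconds of traffic. The window bookkeeping that a DSMS performs for such a query can be sketched in plain Python; this is a conceptual illustration, not STREAM's implementation, and the field names are made up.

```python
from collections import deque

class SlidingWindowSum:
    """Continuous per-source byte count over the last `window` seconds."""
    def __init__(self, window=10.0):
        self.window = window
        self.events = deque()   # (timestamp, src_ip, nbytes), in arrival order
        self.totals = {}        # src_ip -> bytes currently inside the window

    def insert(self, ts, src_ip, nbytes):
        # New tuple enters the window; expired tuples leave it.
        self.events.append((ts, src_ip, nbytes))
        self.totals[src_ip] = self.totals.get(src_ip, 0) + nbytes
        self._expire(ts)

    def _expire(self, now):
        # Drop tuples older than the window and retract their contribution.
        while self.events and self.events[0][0] <= now - self.window:
            _, ip, nb = self.events.popleft()
            self.totals[ip] -= nb
            if self.totals[ip] == 0:
                del self.totals[ip]
```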
Hernes, Kjetil Helge. Design, Implementation, and Evaluation of Network Monitoring Tasks with the STREAM Data Stream Management System. Masteroppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9434
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Hernes, Kjetil Helge&rft.title=Design, Implementation, and Evaluation of Network Monitoring Tasks with the STREAM Data Stream Management System&rft.inst=University of Oslo&rft.date=2006&rft.degree=Masteroppgave
URN:NBN:no-12353
40124
060953888
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9434/2/hernes.pdf
Design, Implementation, and Evaluation of Network Monitoring Tasks with the STREAM Data Stream Management System
oai:www.duo.uio.no:10852/9983
2017-12-07T13:05:09Z
2008
This thesis focuses on how the modelling language UML and UML tools can be applied to model information models in SERES. SERES focuses on tools and methodology to support semantic interoperability, which implies that a multi-user, graphical modelling tool must be offered. UML has been chosen as the modelling language in the SERES project, but the final tool has not been selected, and we have therefore looked at alternative modelling languages such as the Web Ontology Language (OWL) to see which language is best suited for the SERES project.
In this thesis we have used concrete forms to model information models in SERES. The goal is that an agency should reuse data from other agencies' forms rather than asking for the same information again. Simple reuse requires that the answers are stored together with an explanation of what each answer field means.
Requirements are therefore set for the modelling tool that is chosen. The language must be able to give both a high-level and a detailed presentation of the information models, while the semantics are preserved.
Reuse is an important part of this thesis, since the agencies are to use data from other agencies' forms, and it is therefore very important that the chosen modelling tool is reviewed with a focus on how reuse can be expressed in the modelling language. UML, for example, is considered to have good potential for expressing reuse, and we investigate this further at both the UML and the OWL level.
Kardos, Morten, Aatif, Rida, . Anvendelse av OWL og UML med SERES. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/9983
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Kardos, Morten&rft.au=Aatif, Rida&rft.title=Anvendelse av OWL og UML med SERES&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-21032
85851
081440073
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9983/4/kardos.pdf
Anvendelse av OWL og UML med SERES
oai:www.duo.uio.no:10852/9744
2017-04-06T22:15:28Z
2007
The contents of the work may be divided in two parts. In the first part we deal with the computation of the arc-lengths of curves and the areas of surfaces. The second part considers the problem of reparametrizing a parametric surface, in such a manner that the new parametrization is close to conformal.
The first part.
In computational geometry, one often needs to calculate length, angle, area and other intrinsic quantities. In themselves, they are interesting because they give information about the geometric object we are studying. They are also essential in almost every geometric computation or algorithm, from curve interpolation to texture mapping.
Three different ways to compute length and area are investigated, all of which share the property that they only require point evaluations and not derivatives of the given curve or surface.
The first method is based on interpolating the curve or surface by a polynomial and using numerical integration to approximate the length or area of the polynomial. By this process we are able to obtain rules of arbitrary order. Compared to traditional methods, we require one less degree of smoothness for the same order of accuracy.
The second method is based on using Richardson extrapolation to build high-order rules from simpler rules, starting with the chord-length method for curves, and the so-called 'diagonal' method for surfaces. A central issue is proving that these rules have proper error expansions in powers of h. Rules of arbitrary order may be constructed by schemes such as Romberg's.
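As a sketch of this second method (illustrative, not the thesis code): the chord-length rule for a parametric curve has an error expansion in even powers of h, so a single Richardson step cancels the leading h² term.

```python
import numpy as np

def chord_length(curve, n):
    # Chord-length rule: sum the lengths of n chords of the curve on [0, 1].
    # Requires only point evaluations, no derivatives.
    t = np.linspace(0.0, 1.0, n + 1)
    pts = np.array([curve(ti) for ti in t])
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

def richardson_length(curve, n):
    # Chord length has an error expansion in even powers of h, so one
    # Richardson step with doubled resolution cancels the leading h^2 term.
    Lh, Lh2 = chord_length(curve, n), chord_length(curve, 2 * n)
    return (4.0 * Lh2 - Lh) / 3.0
```

On a half circle of length π, the extrapolated rule is several orders of magnitude more accurate than the plain chord-length rule at the same resolution.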
The third method is similar to Clenshaw-Curtis quadrature in that it uses Chebyshev polynomials via the FFT for the interpolation. We also employ the FFT in order to compute the necessary derivatives of the obtained polynomial. Again, we obtain rules of arbitrary order.
The second part.
The first paper investigates the Laplace-Beltrami equation on a parametric surface, and shows several properties of its solution using PDE theory. It is shown that the so-called 'discrete harmonic' or 'cotangent' weights that have been suggested for parametrization purposes may be viewed as arising from a FEM analysis. This is obtained by using linear elements and a particular point-based quadrature to approximate the bilinear form. It is shown that in the L2-norm, this method is of second order. Also, quadratic elements are investigated and shown to have substantial accuracy advantages in examples.
The second paper applies the knowledge and the FEM method investigated above to the issue of reparametrization of parametric surfaces. A reparametrization is sought such that the boundaries are parametrized by scaled arc length, while in the interior it solves the Laplace-Beltrami equation in both components, i.e. it is a harmonic map. This is shown to give parametrizations that are well suited for purposes such as interpolation, intersection, closest point computation and gridding.
Rasmussen, Atgeirr Flø. Some problems in computational geometry related to intrinsic properties of curves and surfaces. Doktoravhandling, University of Oslo, 2007
http://hdl.handle.net/10852/9744
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Rasmussen, Atgeirr Flø&rft.title=Some problems in computational geometry related to intrinsic properties of curves and surfaces&rft.inst=University of Oslo&rft.date=2007&rft.degree=Doktoravhandling
URN:NBN:no-15085
63008
070896194
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9744/1/610_Rasmussen_DUO.pdf
Some problems in computational geometry related to intrinsic properties of curves and surfaces
oai:www.duo.uio.no:10852/9621
2015-02-13T05:01:42Z
1979
I made the first implementation and wrote the original MVC reports while I was a visiting scientist at the Xerox Palo Alto Research Center (PARC) in 1978/79. MVC was conceived as a general solution to the problem of users controlling a large and complex data set.
This document contains a scan of the two original MVC reports dated 12 May and 10 December 1979.
http://hdl.handle.net/10852/9621
URN:NBN:no-14314
52648
070188149
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9621/1/Reenskaug-MVC.pdf
The original MVC reports
oai:www.duo.uio.no:10852/9730
2017-12-07T13:05:09Z
2007
The Internet is nowadays used for a diverse set of services. Some of them have stringent demands regarding latency, data rate, jitter, etc. Examples of applications that make such demands are Voice over IP, IP TV, telemedicine, stock exchange information, and gaming. In general, the growing diversity of tasks being solved using the Internet as the communication medium continuously increases the demand for availability of connections, which is the focus of this thesis.
This thesis focuses on discussing and evaluating recovery schemes that fit the IP Fast Reroute Framework (IPFRR) developed by the IETF Routing Area Working Group. IPFRR is a good framework for recovery schemes that provide fast reroute in connectionless IP networks. Two recovery schemes have been evaluated, namely IP Fast Reroute Using Not-via Addresses (Not-via) and Failure Insensitive Routing (FIR).
The concept of the Not-via recovery scheme is that whenever a link fails, the router considers the neighboring node as down. It then forwards traffic to the next-next hop towards the destination on the path used in the failure-free case.
The FIR scheme, on the other hand, is able to infer network failures by looking at the flight of a packet, i.e. the path it takes through the network. Based on this, the scheme proactively creates forwarding tables that ensure that traffic never traverses failed network elements.
To be able to compare the performance of the recovery schemes in both real-life and synthetic networks, a routing simulator was developed. It was used to show that for most networks, FIR will provide shorter recovery paths than Not-via. Since longer recovery paths lead to more links being used for recovery traffic, Not-via will introduce a larger load than FIR. However, an important feature of the Not-via scheme is that it is able to recover from both node and link failures, whereas the FIR scheme is only able to recover from link failures.
It was found that IP Fast Reroute using Not-via addresses would probably be a better choice if a scheme should be implemented in hardware, since it has fewer requirements for doing so. This scheme also has better coverage than the Failure Insensitive Routing scheme and as such would be a better choice for any network operator.
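The Not-via idea of routing as if the failed neighbour were down amounts to a shortest-path computation on a pruned topology. A small illustrative sketch (not the thesis simulator; the graph and function names are made up):

```python
import heapq

def dijkstra(adj, src):
    # Standard Dijkstra over an adjacency map {node: [(neighbor, weight), ...]}.
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return dist, prev

def not_via_path(adj, src, failed_next_hop, dst):
    """Route from src to dst as if failed_next_hop were down (the Not-via idea)."""
    # Prune the presumed-failed node and all edges towards it.
    pruned = {u: [(v, w) for v, w in nbrs if v != failed_next_hop]
              for u, nbrs in adj.items() if u != failed_next_hop}
    dist, prev = dijkstra(pruned, src)
    if dst not in dist:
        return None  # no recovery path exists
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]
```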
Nyquist, Tommy Andre. Evaluating Local Proactive Recovery Schemes for IP Networks. Masteroppgave, University of Oslo, 2007
http://hdl.handle.net/10852/9730
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Nyquist, Tommy Andre&rft.title=Evaluating Local Proactive Recovery Schemes for IP Networks&rft.inst=University of Oslo&rft.date=2007&rft.degree=Masteroppgave
URN:NBN:no-15075
62549
070894094
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9730/3/Nyquist.pdf
Evaluating Local Proactive Recovery Schemes for IP Networks
oai:www.duo.uio.no:10852/8896
2017-12-07T13:05:09Z
2011
In light of findings indicating that improved access to relevant information is crucial to reduce risk during emergency response, research has emphasized the need for decision support tools capable of aiding on-scene emergency personnel. Motivated by the latter, the aim of this thesis is to advance an understanding of how visualization can be used as a means to communicate risk information to operative leaders working in emergency situations. In particular, we identify the needs of operative leaders regarding access to and communication of risk information; formulate requirements that solutions for visualization of risk should conform to in order to meet these needs; develop a solution for visualization of risk that satisfies these requirements; and evaluate this solution with respect to the previously identified needs. The identification of the needs of operative leaders is based on an empirical analysis and a review of relevant research. The evaluation of the solution is based on an analytical walkthrough inspection with operative leaders, and comprehensive usability testing. The findings from the research indicate that operative leaders, in order to make sound decisions during emergency situations, often need to understand the underpinning cause of risk, and the location and nature of the physical objects posing risk. The findings also show that geospatial visualization is an efficient and effective means for generating such insight, enabling operative leaders to prioritize between risk objects related to an incident, and to rationalize why some risk objects are more critical than others.
Eide, Aslak Wegner. Visualization of Risk as Decision Support in Emergency Situations. Masteroppgave, University of Oslo, 2011
http://hdl.handle.net/10852/8896
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Eide, Aslak Wegner&rft.title=Visualization of Risk as Decision Support in Emergency Situations&rft.inst=University of Oslo&rft.date=2011&rft.degree=Masteroppgave
URN:NBN:no-28118
120175
114802718
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8896/1/Eide.pdf
Visualization of Risk as Decision Support in Emergency Situations
oai:www.duo.uio.no:10852/9991
2017-12-07T13:05:10Z
2008
Automatic discovery of services and resources is a crucial feature to achieve the expected user-friendliness in Mobile Ad-hoc Networks (MANETs). Due to limited computing power, scarce bandwidth, high mobility and the lack of a central coordinating entity, service discovery in these networks is a challenging task.
In this thesis, I have developed a service discovery protocol (Mercury) utilizing a combination of different optimization techniques: The performance is increased using cross-layer interaction between the application layer and the routing layer. The service information is described using Bloom filters and distributed using Optimized Link State Routing (OLSR). A caching scheme is implemented to obtain further reductions of both overhead and latency.
The analysis and simulation results show that the service discovery proposal induces very low overhead to OLSR and is superior to application-layer solutions. The proposal is implemented as a plugin to the OLSR implementation olsrd for real-world deployments.
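The Bloom-filter representation of service information can be illustrated with a minimal filter; the bit-array size, hash construction and service names below are illustrative choices, not Mercury's actual parameters.

```python
import hashlib

class BloomFilter:
    """Compact set membership with false positives but no false negatives."""

    def __init__(self, m_bits=256, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = 0  # the filter itself: an m-bit integer

    def _positions(self, item):
        # Derive k bit positions from salted SHA-256 digests (one possible scheme).
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:4], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item):
        return all(self.bits >> p & 1 for p in self._positions(item))
```

A query can return a false positive but never a false negative, which is why such filters compress service advertisements so well before flooding.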
Flathagen, Joakim. Service Discovery in Mobile Ad-hoc Networks. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/9991
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Flathagen, Joakim&rft.title=Service Discovery in Mobile Ad-hoc Networks&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-21035
86496
091919797
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9991/2/flathagen.pdf
Service Discovery in Mobile Ad-hoc Networks
oai:www.duo.uio.no:10852/8985
2017-12-07T13:05:10Z
2011
In computer security terminology, a honeypot can be described as an information system resource whose value lies in unauthorized or illicit use of that resource. It can be used to divert attackers away from production systems, as well as to collect information about them, their attack patterns and methods. This information can in turn be used to improve protection mechanisms, either by security professionals or system administrators. A honeypot is a security tool where one of the intentions can be to help mitigate risk in an organization. However, honeypots may themselves introduce risk to the organizational environment, and this must be taken into consideration before deploying a honeypot.
Tjelta, John Børge. Honeypots in network perimeter defense systems. Masteroppgave, University of Oslo, 2011
http://hdl.handle.net/10852/8985
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Tjelta, John Børge&rft.title=Honeypots in network perimeter defense systems&rft.inst=University of Oslo&rft.date=2011&rft.degree=Masteroppgave
URN:NBN:no-30826
135795
121660745
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8985/3/Tjelta.pdf
Honeypots in network perimeter defense systems
oai:www.duo.uio.no:10852/9903
2017-12-07T13:05:10Z
2008
New technology, especially mobile phones and the Internet, has an increasing influence on society. These new technologies are tools that are becoming progressively more ubiquitous and accessible to the masses. This gives a growing number of people the opportunity to produce and publish; the users are also becoming the creators.
The aim of this thesis is to discuss user-created content as an aid in the creative process. The fundamental focus is on how the creative process unfolds when users are presented with the opportunity to create and share their own content. It also looks at what effects user-created content, generated with mobile phones and shared over the Internet, can have on the creative process, and the opportunity it presents for new creative thinking on the subject of cultural heritage.
In order to do this, the technologies, along with their history and present-day uses, are presented in depth. The phenomenon of user-created content is introduced and the process of user-created content explored. Existing user-created content and its creators, as well as a framework for thinking about creativity, are presented.
Two case studies were conducted to be able to explore this in real life, one in a museum setting and the other at a youth club. Both presented teenagers with the opportunity to express themselves in regard to their cultural heritage.
Fahle, Ine. User created content as aid in the creative process. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/9903
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Fahle, Ine&rft.title=User created content as aid in the creative process&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-19169
75973
080982468
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9903/2/Fahle.pdf
User created content as aid in the creative process
oai:www.duo.uio.no:10852/9880
2017-12-07T13:05:10Z
2008
The objective of the SICS Java Port Project is to create tools and a methodology to translate a large financial software application from IBM VisualAge for Smalltalk to Java. My main motivation for this report is to present some of the major technical issues we have encountered as part of the translation, to present alternatives, and to describe the solutions we chose. I would like to share some of the knowledge the project group has accumulated over the last four years.
Each problem is briefly demonstrated, followed by a short discussion, the solution we ended up with, and a demonstration of the results.
In our concrete project, the approach has been highly successful. We have found ways to overcome the most important language differences. The few problems that we have not been able or willing to solve have fortunately been limited to a manageable number of occurrences.
Although it is impossible to apply the experiences of this project directly to other Smalltalk systems, the results should be promising for others facing a similar challenge. Hopefully the reader will find the discussions relevant and, if not immediately reusable, they will at least serve as inspiration for developing custom solutions.
Skarsaune, Martin. The SICS Java Port Project. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/9880
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Skarsaune, Martin&rft.title=The SICS Java Port Project&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-19164
74031
080981151
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9880/1/Skarsaune.pdf
The SICS Java Port Project : automatic translation of a large system from Smalltalk to Java
oai:www.duo.uio.no:10852/9658
2017-12-07T13:05:10Z
2007
Object oriented middleware provides an application with the possibility of distributing objects to multiple nodes in a distributed system. In this thesis, we have developed a middleware that, in addition to distributing objects, makes it possible to migrate them. As such, it becomes possible to dynamically relocate objects based on the requirements of the application. We use a distributed name service to maintain references to objects, which means that any given object is managed by the node it is currently located at. This middleware was derived from the requirements and characteristics displayed by interactive real-time applications, i.e., applications that are time dependent and event based. To demonstrate the usability of the middleware we have implemented a test application, in the form of a chat system derived from a massively multiplayer online game (MMOG).
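The scheme described, where a distributed name service tracks which node currently manages each object so that references stay valid across migrations, could be sketched roughly as follows (all class and node names are hypothetical, not taken from the thesis):

```python
class NameService:
    """Distributed name service: maps object IDs to their current node."""
    def __init__(self):
        self._locations = {}

    def register(self, obj_id, node):
        self._locations[obj_id] = node

    def lookup(self, obj_id):
        return self._locations[obj_id]


class Middleware:
    def __init__(self, name_service):
        self.ns = name_service

    def create(self, obj_id, node):
        self.ns.register(obj_id, node)

    def migrate(self, obj_id, target_node):
        # Relocate the object and update the name service, so the object
        # is from now on managed by the node it is currently located at.
        self.ns.register(obj_id, target_node)

    def invoke(self, obj_id):
        # Every call is routed via the name service to the current location.
        return self.ns.lookup(obj_id)


ns = NameService()
mw = Middleware(ns)
mw.create("chat-room-1", node="node-A")
mw.migrate("chat-room-1", target_node="node-B")  # e.g. to balance load
print(mw.invoke("chat-room-1"))  # prints "node-B"
```

The point of the indirection is that callers never cache a node address; a migration only has to update the name service entry.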
Beskow, Paul. Migration of Objects in a Middleware for Distributed Real-Time Interactive Applications. Masteroppgave, University of Oslo, 2007
http://hdl.handle.net/10852/9658
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Beskow, Paul&rft.title=Migration of Objects in a Middleware for Distributed Real-Time Interactive Applications&rft.inst=University of Oslo&rft.date=2007&rft.degree=Masteroppgave
URN:NBN:no-14913
58108
070813744
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9658/3/Beskow.pdf
Migration of Objects in a Middleware for Distributed Real-Time Interactive Applications
oai:www.duo.uio.no:10852/9370
2014-12-26T05:04:08Z
2005
Handwritten diaries are becoming less and less popular now that today's technologies can do the same automatically, without us having to think about it. Mobile phones in particular have become one of the biggest winners of technological development. Mobile phones are often used for simple recording and storage of communication data that we may want to remember, whether consciously or not. We often forget things that have been done, said, seen or heard, especially specific details of, for example, a particular place, time, event or setting. A mobile phone is primarily used as a communication device, but it can also serve other purposes, such as a little extra brain, a personal assistant, a reminder device, a storage medium or an electronically automated diary.
This thesis deals with the concept of lifelogging and how a mobile phone can be used to record things that may be useful to remember and retrieve later. This can be very useful, for example, for business people or private individuals who forget details and cannot always recall a specific detail that may be very important for later use. It can also be of great benefit to people with impairments, for example people suffering from memory loss, brain injuries or other disorders. A number of systems in which logging is a central theme have therefore been researched and developed.
The purpose of the thesis is to find a way to combine theories such as context, information retrieval and mobile logging; in other words, to find how data should be presented and how one can easily browse through it with respect to the user's current activity. The thesis places particular emphasis on:
- Illustrating how context information can support and advance the user's needs at any given time
- Defining system requirements through scenarios and theoretical concepts
- Defining potential problems in the design of the system.
The method I have used is to review and observe previous work and to study the relevant literature and documentation from earlier work on the topic of mobile logging. On this basis I tried to identify my research question. I have also contacted and exchanged e-mails with the developers of earlier systems to understand them better, and have tested some of the systems myself.
When comparing the four definitions of context with this thesis, I find that all of them are reflected:
- Machine context (the mobile phone's network connections, communication costs, PC synchronization and web upload)
- User context (location and social surroundings)
- Time context (the time on the mobile phone)
- Physical context (sound)
The mobile logging system is also related to context information in that it presents information and services, executes services and, last but most importantly, logs relevant information for later retrieval. This shows that context information can be recorded on the mobile phone at any time, whether it operates actively or passively. Context is thus an integral part of such a system.
As for searching, this is a phenomenon that has become more and more popular. As with Google and many of the other search engines, there are two principal ways of looking for information:
• Searching – looking for information using given keywords and criteria.
• Navigating – looking for information by navigating through relevant pages.
Different people have different needs when searching for relevant information. Those who do not know what they are looking for prefer to navigate, while others use search precisely because they know what information they want.
On a mobile phone one can, in addition, obtain relevant information without user interaction by means of context information. This was exemplified by using the cell ID for location. The user will receive relevant information in appropriate situations. In some cases searching is used to look for information on the mobile phone; at other times it may make sense to navigate to the relevant information.
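A minimal sketch of this idea: log entries tagged with context such as time and cell ID can be retrieved either by keyword search (the user knows what they want) or by context filtering (relevant entries surface without a query). All names here are hypothetical illustrations, not the thesis's system:

```python
from dataclasses import dataclass, field
import time

@dataclass
class LogEntry:
    text: str       # logged content, e.g. a note or call summary
    cell_id: int    # location context from the current network cell
    timestamp: float = field(default_factory=time.time)

class ContextLog:
    """Stores entries with context; supports both search and context retrieval."""
    def __init__(self):
        self.entries = []

    def log(self, text, cell_id):
        self.entries.append(LogEntry(text, cell_id))

    def search(self, keyword):
        # "Searching": the user supplies a keyword they already have in mind.
        return [e for e in self.entries if keyword.lower() in e.text.lower()]

    def relevant_here(self, current_cell_id):
        # Context-driven retrieval: no query, filter by the user's location.
        return [e for e in self.entries if e.cell_id == current_cell_id]

log = ContextLog()
log.log("Meeting with supervisor about thesis", cell_id=4021)
log.log("Bought train ticket", cell_id=1733)
print(len(log.search("thesis")))      # keyword search
print(len(log.relevant_here(4021)))   # passive retrieval by cell ID
```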
Mai, Vu Phi. Mobiltelefonen. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9370
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Mai, Vu Phi&rft.title=Mobiltelefonen&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-11492
34402
060062827
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9370/1/34402.pdf
Mobiltelefonen : en elektronisk automatisert dagbok
oai:www.duo.uio.no:10852/34823
2017-12-07T13:05:10Z
2012
Countless applications use the propagation and reflection of sound to gain better knowledge of the surrounding medium. This medium can, for instance, consist of a set of complex and heterogeneous biological tissues, or of ships in the sea several kilometres away from the sound receiver. In all cases, the sound propagation is affected by nonlinear effects. In many applications those effects are neglected, while in others they are exploited.
In this thesis we investigate the possibility of using the nonlinear effects in fields where they are avoided, neglected, or overlooked. We also try to establish faster or more accurate estimations of nonlinear sound fields. The two domains investigated are underwater acoustics, with applications such as echo sounders and acoustic Doppler current profilers, and medical imaging.
In underwater acoustics, we studied the combined use of the second harmonic and fundamental signals for imaging using a scientific echo sounder and for determining current velocities using acoustic Doppler current profilers. We show that the use of the second harmonic signal can improve performance in these applications when the range is limited.
In medical imaging, we investigated the use of the second harmonic signal with the multi-line transmission technique. In this case too, images produced by the second harmonic signal suffer from fewer perturbations than images produced by the fundamental signal.
We have developed new models to estimate the nonlinear propagation of sound. One model aims to appropriately describe the attenuation and dispersion observed in complex media. It derives a wave equation with a loss operator defined by fractional-order derivatives; the model relies on variations of the constitutive equations that adequately describe the stress-strain relation and heat transfer. The other models, based on the quasi-linear approximation, aim to speed up the implementation or increase its flexibility. They proved in one case to be faster than other state-of-the-art simulators, and in the other case more flexible than alternative methods. Given that the conditions for quasi-linear propagation are satisfied, these simulators adequately describe the sound field for the fundamental and second harmonic signals.
Prieur, Fabrice. Nonlinear propagation of ultrasonic signals. Doktoravhandling, University of Oslo, 2012
http://hdl.handle.net/10852/34823
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Prieur, Fabrice&rft.title=Nonlinear propagation of ultrasonic signals&rft.inst=University of Oslo&rft.date=2012&rft.degree=Doktoravhandling
URN:NBN:no-33588
176804
Fulltext https://www.duo.uio.no/bitstream/handle/10852/34823/3/dravhandling-prieur.pdf
Nonlinear propagation of ultrasonic signals : theoretical studies and potential applications
oai:www.duo.uio.no:10852/9447
2017-12-07T13:05:10Z
2006
A challenge in building IT systems is to make them fit as well as possible with how the business operates. How the business operates is described through its business processes, organisational structure, resources and business rules. To adapt an IT system to an enterprise, it can be helpful to create models of the enterprise, at least of the part of the enterprise that the IT system will affect, or of the parts that affect the IT system. Enterprise modelling has existed for a long time, and there are countless methods and modelling languages for creating a model of the enterprise. Until now, however, these models have seen little further use in specifying the architecture of an IT system.
Business rules must be handled in an IT system, because these are rules that change frequently. The question is how this handling can be made more efficient. By separating the rules into a dedicated component, rule handling becomes more efficient, because a single component is easier to maintain than tracking down all the rules embedded in other components. Business rules should be identified at the enterprise modelling level, because it is business people who hold the knowledge about these rules.
This thesis looks at how enterprise modelling can be used to specify IT systems and how business rules can be separated out as a dedicated component. First, existing solutions are evaluated to see whether they can be used for this. Then a method is introduced that can meet the challenges described above, to the extent that existing solutions fall short.
The thesis evaluates various methods and modelling languages for creating enterprise models and two languages for model transformations, and also looks at some methods for creating architecture models. The method introduced, Regel- og modelldrevet metode (REMO), proposes a way to transform parts of the enterprise model into an architecture model.
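The argument for separating frequently changing business rules into a dedicated component, rather than embedding them in application code, can be illustrated abstractly (a sketch with hypothetical names, not the thesis's REMO method):

```python
# A rule component: all business rules live in one place, so a changed
# rule is swapped out here instead of being hunted down across the system.
class RuleComponent:
    def __init__(self):
        self._rules = {}

    def register(self, name, predicate):
        self._rules[name] = predicate

    def check(self, name, facts):
        return self._rules[name](facts)


rules = RuleComponent()
# Business people own this knowledge; a discount rule that may change often.
rules.register("bulk_discount", lambda order: order["quantity"] >= 100)
print(rules.check("bulk_discount", {"quantity": 150}))  # prints True

# When the rule changes, only the component is updated; callers are untouched.
rules.register("bulk_discount", lambda order: order["quantity"] >= 50)
```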
Løland, Unni. Virksomhetsmodellering som basis for spesifikasjon av IT-systemer. Masteroppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9447
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Løland, Unni&rft.title=Virksomhetsmodellering som basis for spesifikasjon av IT-systemer&rft.inst=University of Oslo&rft.date=2006&rft.degree=Masteroppgave
URN:NBN:no-12364
40547
060954531
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9447/1/Loland.pdf
Virksomhetsmodellering som basis for spesifikasjon av IT-systemer : Et metodeforslag
oai:www.duo.uio.no:10852/8935
2013-03-12T07:58:18Z
2003
In this paper we investigate the inconsistencies that occur when a curve network is integrated into a terrain surface. Terrain and network data are important geographical data, but these data sets are traditionally maintained in separate systems. This makes integration complicated, even though certain networks, such as water and road networks, constrain the terrain. The reason for the inconsistency is missing or unreliable data; for example, height information will usually be missing from the network data.
The first problem investigated is topological consistency: the network should be integrated into the terrain model. The terrain is represented as a constrained Delaunay triangulation, and the network as a planar graph. The basis for the integration is that the triangulation can also be represented as a planar graph. In addition to the topological constraint, each network has specific geographical constraints. For example, a water network may have three constraints with respect to the terrain: rivers shall decrease monotonically from mountain to sea level, lakes shall be flat, and rivers shall run in the bottom of valleys. These issues are visualized and analysed, and some possible methods for solving the problems are presented.
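Two of the water-network constraints are simple to state as checks over the heights of the terrain vertices a feature passes through. A minimal sketch (hypothetical helper names, not the paper's implementation):

```python
def river_is_monotonic(heights):
    """Check that heights along a river, ordered from source to outlet,
    never increase (rivers shall decrease monotonically to sea level)."""
    return all(h2 <= h1 for h1, h2 in zip(heights, heights[1:]))

def lake_is_flat(heights, tolerance=0.0):
    """A lake constrains all its boundary vertices to one water level."""
    return max(heights) - min(heights) <= tolerance

print(river_is_monotonic([312.0, 290.5, 290.5, 154.2, 3.1]))  # True
print(river_is_monotonic([312.0, 290.5, 295.0]))              # False: uphill segment
print(lake_is_flat([120.0, 120.0, 120.0]))                    # True
```

A violation flags exactly the kind of unreliable height data the integration has to repair.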
Kristoffersen, Anja Kristine. Høydemodeller med nettverksføringer - visualisering og analyse. Hovedoppgave, University of Oslo, 2003
http://hdl.handle.net/10852/8935
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Kristoffersen, Anja Kristine&rft.title=Høydemodeller med nettverksføringer - visualisering og analyse&rft.inst=University of Oslo&rft.date=2003&rft.degree=Hovedoppgave
URN:NBN:no-6881
12877
030904528
Høydemodeller med nettverksføringer - visualisering og analyse
oai:www.duo.uio.no:10852/9163
2014-12-26T05:00:39Z
2003
"Three approaches to the design of mobile IT: user-friendliness, usability and user focus" (brukervennlighet, brukbarhet og brukerfokus) is about different approaches to usability engineering and their implications for the design of mobile IT. The aim of the thesis is to examine how alternative approaches to design can help us build better applications for mobile IT. The thesis presents three perspectives on design, represented by three research traditions: user-friendliness, represented by traditional usability engineering, or what we may call the American tradition; usability, represented by the Scandinavian tradition; and user focus, represented by a tradition that takes a somewhat philosophical approach to how users experience technology.
The thesis also includes three empirical studies that each, in their own way, provide insight into usability aspects of mobile IT: a study of PDA software for ship surveyors developed at Det Norske Veritas, an experiment comparing data entry on a PDA and on a PC, and interviews with a user and with developers of software for mobile IT.
The thesis concludes that the perspective called user focus can serve as a complementary dimension in software design. Finally, a proposal is presented for how this can be applied in a practical setting.
Holten, Jonas Båfjord. Tre tilnærminger til design av mobil IT: Brukervennlighet, brukbarhet og brukerfokus. Hovedoppgave, University of Oslo, 2003
http://hdl.handle.net/10852/9163
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Holten, Jonas Båfjord&rft.title=Tre tilnærminger til design av mobil IT: Brukervennlighet, brukbarhet og brukerfokus&rft.inst=University of Oslo&rft.date=2003&rft.degree=Hovedoppgave
URN:NBN:no-9898
20656
042005817
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9163/1/HovedfagJonas.pdf
Tre tilnærminger til design av mobil IT: Brukervennlighet, brukbarhet og brukerfokus
oai:www.duo.uio.no:10852/8934
2020-03-12T23:31:09Z
2003
This thesis investigates different memory types for use as synaptic storage in a neuromorphic application for on-chip learning. Our main concern was to find a suitable implementation for this purpose. We were looking for a memory element that could be used as distributed storage with no external control signals or backup. This memory should preferably be analog, which excludes common digital storage techniques such as latches and flip-flops. Dynamic multi-level or analog memory is also insufficient, since it requires external storage with AD/DA converters to preserve its multi-level or analog value. Furthermore, the stored value should be easy to alter. Previous work has used floating gates (FG), which have many advantages in a neuromorphic design, i.e. permanent storage, slow learning and infinite resolution. However, there are severe device property mismatches, and specialized initialization and programming techniques are required to alter the stored value. After initial investigation and searches for relevant implementations, no optimal solution was found, and we decided to test a novel memory element, which is presented in this thesis. It is a multi-level memory that stores an analog value over a short period of time. The memory preserves its own state through a local feedback path and does not require external control signals. The value is easily altered by directly applying a voltage or, as done in this thesis, by injecting or drawing a current.
We start with an introduction to neuromorphic electronics and different analog and multi-level memories. We then present the objective of the thesis and basic theory. Next, the different circuit components are presented with test results. Last, the implementation is discussed and future work is proposed.
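The behaviour of such a self-refreshing multi-level cell can be illustrated abstractly: a local feedback pulls the stored analog value toward the nearest stable level, so small disturbances decay while a large enough injected current moves the state to a new level. This is a numerical caricature with made-up parameters, not the actual circuit:

```python
def feedback_step(v, levels, gain=0.5):
    """One feedback iteration: pull the stored value toward the
    nearest stable level (the cell's local feedback path)."""
    nearest = min(levels, key=lambda level: abs(level - v))
    return v + gain * (nearest - v)

def settle(v, levels, steps=20):
    for _ in range(steps):
        v = feedback_step(v, levels)
    return v

LEVELS = [0.0, 1.0, 2.0, 3.0]   # four stable states of the multi-level cell

# A small disturbance decays back to the stored level...
print(round(settle(1.1, LEVELS), 3))  # prints 1.0
# ...while a large injected perturbation writes a new level.
print(round(settle(2.4, LEVELS), 3))  # prints 2.0
```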
Riis, Håvard Kolle. Multi-level static memory for on-chip learning. Hovedoppgave, University of Oslo, 2003
http://hdl.handle.net/10852/8934
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Riis, Håvard Kolle&rft.title=Multi-level static memory for on-chip learning&rft.inst=University of Oslo&rft.date=2003&rft.degree=Hovedoppgave
URN:NBN:no-6880
12875
031724507
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8934/1/hoppgavehkriis.pdf
Multi-level static memory for on-chip learning
oai:www.duo.uio.no:10852/9446
2017-12-07T13:05:11Z
2006
This thesis is part of a larger study of the advantages of using colored Petri nets as a modeling language for railway systems. The railroad layout serves as a specification layer, and basic railroad components are constructed using colored Petri nets. These basic components correspond to physical railroad components such as turnouts, crossings, slips, etc. Large-scale colored Petri net models of the entire subway system are then generated automatically, and the colored Petri nets are automatically translated to Maude code.
In this thesis we have expanded the algebra of the specifications and introduced railroad nets as an extension of colored Petri nets.
The colored Petri net components are refinable, which makes it possible to simulate different behaviors. Several different refinements are presented in this thesis, along with a tool for railroad simulation.
Bjørk, Joakim. Executing Large Scale Colored Petri Nets by Using Maude. Hovedoppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9446
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Bjørk, Joakim&rft.title=Executing Large Scale Colored Petri Nets by Using Maude&rft.inst=University of Oslo&rft.date=2006&rft.degree=Hovedoppgave
URN:NBN:no-12363
40542
060954450
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9446/1/Bjork.pdf
Executing Large Scale Colored Petri Nets by Using Maude : Simulating Railroad Systems
oai:www.duo.uio.no:10852/9676
2017-12-07T13:05:11Z
2007
In this thesis I look at how clinicians in a hospital communicate while moving around the hospital. Through field observation and interviews I have mapped clinicians' communication needs while mobile and the technologies and methods they use today. On this basis I have looked at the introduction of a mobile communication device for clinicians and the functionality and properties it should have. The thesis presents a prototype of a mobile device and discusses its functionality against the empirical results, as well as earlier studies and theory in the field.
The study showed that today's communication solution creates several unnecessary interruptions in clinicians' work. Today's solution also offers no information about who is making contact, why they are making contact, or what it concerns. The prototype that was developed showed that by introducing information about the user's degree of availability and about what the user is doing, clinicians can adapt their communication to the recipient's availability and actions.
By offering several modes of communication, clinicians can limit the use of direct communication, which interrupts their work. The prototype introduces text messages with prioritisation and acknowledgement functionality. The empirical findings showed that clinicians could often use text messages instead of direct communication.
Knudsen, Pål Vermund. Bruk av mobile enheter for kommunikasjon på sykehus. Masteroppgave, University of Oslo, 2007
http://hdl.handle.net/10852/9676
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Knudsen, Pål Vermund&rft.title=Bruk av mobile enheter for kommunikasjon på sykehus&rft.inst=University of Oslo&rft.date=2007&rft.degree=Masteroppgave
URN:NBN:no-14924
58888
070827346
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9676/3/Knudsen.pdf
Bruk av mobile enheter for kommunikasjon på sykehus
oai:www.duo.uio.no:10852/10032
2014-12-26T05:04:15Z
2008
In 2001 and 2002 a wave of corporate and accounting scandals became known to the public. As a direct consequence of these frauds, the Sarbanes-Oxley Act of 2002, also known as the Public Company Accounting Reform and Investor Protection Act of 2002, was signed into law. The main focus of Sarbanes-Oxley compliance is to ensure the accuracy of financial reporting and of the systems that support this data. The law directly affected all US publicly traded companies and cost millions to comply with. These costs led European public companies to consider delisting from the American stock market, not knowing that a European version of the Act (the 8th Company Law Directive) would come into force four years later. This project focuses on comparing these two laws, using promise theory as a model to better see the similarities and differences and to understand the relationship between the affected parties of both laws in terms of promises. We finally relate Sarbanes-Oxley to technology, more specifically policy-based configuration management.
Dagnew, Iman. Sarbanes - Oxley Act of 2002 vs. The 8th Company Law Directive. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/10032
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Dagnew, Iman&rft.title=Sarbanes - Oxley Act of 2002 vs. The 8th Company Law Directive&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-21403
89358
091977894
Fulltext https://www.duo.uio.no/bitstream/handle/10852/10032/1/Dagnew_Iman.pdf
Sarbanes - Oxley Act of 2002 vs. The 8th Company Law Directive
oai:www.duo.uio.no:10852/10031
2014-12-26T05:04:16Z
2008
Routers have become ubiquitous now that nearly everybody wants to be online, making them one of the technologies most responsible for the existence and popularity of one of the 20th century's greatest communications developments, the Internet. Network management is important and necessary when dealing with a large number of routers from different manufacturers, because they use very different configuration languages that are proprietary and completely separate from server configuration. Our goal is to discover whether these incompatible languages can be unified into a single open standard that can be integrated into server management by using promise theory. This thesis has both practical and theoretical parts: it consists of building a Linux router, modeling a set of routing configurations using promise theory, and designing a set of promises for cfengine 3 that can configure the router directly from the cfengine 3 promise language.
Phooripoom, Nakarin. A Promising Cfengine Linux Router. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/10031
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Phooripoom, Nakarin&rft.title=A Promising Cfengine Linux Router&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-21402
89357
091977703
Fulltext https://www.duo.uio.no/bitstream/handle/10852/10031/1/Phooripom_Nakarin.pdf
A Promising Cfengine Linux Router
oai:www.duo.uio.no:10852/10030
2014-12-26T05:04:16Z
2008
Clusters have become the main platform for parallel and distributed high-performance computing. Following the development of high-performance computer architectures, more and more branches of natural science benefit from large and efficient computational power, for instance bioinformatics, climate science, computational physics, computational chemistry and marine science. Efficient and reliable computing power not only expands the demand from existing high-performance computing users but also attracts new ones. Efficiency and performance are the main factors in high-performance computing, and most high-performance computers exist as clusters. Investigating the efficiency of a cluster is very interesting and never finished, as it depends heavily on the individual users, so monitoring and tuning of high-performance and cluster facilities are always necessary. This project focuses on monitoring a high-performance computer, comparing queuing status and workload on the cluster's different computing nodes. As power consumption is a major issue nowadays, the project also tries to estimate the power consumption at these sites and to justify our method of estimation.
Halifu, Saerda. Investigation of Cluster and Cluster Queuing System. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/10030
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Halifu, Saerda&rft.title=Investigation of Cluster and Cluster Queuing System&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-21401
89356
091977592
Fulltext https://www.duo.uio.no/bitstream/handle/10852/10030/1/Halifu_Saerda.pdf
Investigation of Cluster and Cluster Queuing System
oai:www.duo.uio.no:10852/10029
2014-12-26T05:04:16Z
2008
In recent years the shared services concept has become an integral part of business. These shared services can take the form of information technology, engineering and much more. Service providers spend huge amounts of money to build an infrastructure that can provide efficient and valued services to customers. In the IT business these services vary from basic consultancy and managing the customers' IT operations to running high-priority business processes such as online banking. Customers pay for these services, so a resource usage metering mechanism is required to charge users accurately, and at the same time a monitoring mechanism is required to check the services provided for resource contention and service degradation, and to support future capacity planning. If a service provider is unable to develop an accurate chargeback and monitoring mechanism, the relationship between service provider and customer becomes a point of frustration for both sides. Chargeback and monitoring systems developed for physical environments cannot measure resource usage in a virtual environment (z/VM), because resources there are shared between users and it becomes difficult to attribute resource usage to a specific user. Until now only a few tools have been developed that provide efficient resource metering and monitoring in a z/VM virtual environment, and since every business has its own requirements and system setup, these tools mostly need customization to fit the business.
This work mainly concentrated on what kind of resource utilization data is available on z/VM and on Linux guests running on z/VM, in order to effectively charge customers running their guest Linux operating systems in a z/VM-based virtual environment, and to monitor CPU and memory utilization to check whether the system's estimate (PWSS) of memory allocation for Linux guests running different applications is a good estimate or requires optimization, since memory utilization is considered particularly expensive in a virtual environment in the context of system performance. The study also includes a comparison between this chargeback technique and some commercial products from IBM and CA (Computer Associates) that provide chargeback and monitoring in z/VM-based virtual environments, and presents some benefits of this work in the proposed environment.
Shahzad, Kashif. A Technical Study Of Charge back And Monitoring Systems In Virtual Environment. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/10029
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Shahzad, Kashif&rft.title=A Technical Study Of Charge back And Monitoring Systems In Virtual Environment&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-21400
89355
091977495
Fulltext https://www.duo.uio.no/bitstream/handle/10852/10029/1/Shahzad_Kashif.pdf
A Technical Study Of Charge back And Monitoring Systems In Virtual Environment
oai:www.duo.uio.no:10852/10028
2014-12-26T05:04:18Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
This thesis considers the administration of virtual machines on IBM mainframes running z/VM. A solution for administering z/VM through a Linux VM running on a custom-designed z/VM architecture is developed and implemented. The administration tool used is a slightly expanded version of MLN; the expansions allow MLN to use plugins for technology-specific code. Support for z/VM is then added through a plugin containing all z/VM-specific code. Results from the scenarios conducted show that the administration process can be significantly automated and abstracted compared to normal z/VM administration. In addition, increased security and safety are achieved through the protective limitations and control offered by the Programmable Operator running on z/VM.
Gundersen, Marius Brath. Automation and Abstraction for Scalable z/VM Linux Administration on the zSeries Mainframe. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/10028
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Gundersen, Marius Brath&rft.title=Automation and Abstraction for Scalable z/VM Linux Administration on the zSeries Mainframe&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-21399
89354
091974569
Fulltext https://www.duo.uio.no/bitstream/handle/10852/10028/1/Gundersen_Marius-Brath.pdf
Automation and Abstraction for Scalable z/VM Linux Administration on the zSeries Mainframe
oai:www.duo.uio.no:10852/8806
2015-02-13T05:04:12Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2010
We present a partial correctness proof system for ABS, an imperative, concurrent, object-oriented language that provides an asynchronous communication model suitable for loosely coupled objects in a distributed setting. The proof system is derived from a standard sequential language by means of a syntactic encoding and applies Hoare rules. The execution of a distributed system is represented by its communication history, which can be specified by a history invariant. Modularity is achieved by establishing history invariants independently for each object and composing them as needed. This results in behavioral specifications of distributed systems in an open environment. As a case study we model and analyze the reader-writer example in the framework we developed.
http://hdl.handle.net/10852/8806
URN:NBN:no-26369
107630
10247060x
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8806/1/ResRep401.pdf
Observable behavior of distributed systems : component reasoning for concurrent objects
oai:www.duo.uio.no:10852/9111
2013-03-12T07:58:24Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2004
Wireless ad hoc networks are networks composed solely of wireless, arbitrarily moving nodes. There are no centralized entities like servers or routers, so all network functionality must be provided by the nodes themselves.
These properties create problems for several existing protocols, among them routing protocols. Several routing protocols have been developed for use in ad hoc networks, among them the Optimized Link State Routing (OLSR) protocol.
For a network to be secure and reliable a secure routing protocol must be used. In its native form the OLSR protocol does not provide any security at all. To create a more secure OLSR protocol I propose to add a digital signature to the header of the routing packets. This signature, which can be verified by the receiving node, should bring some security into the OLSR protocol.
In this master thesis I will discuss and analyze the proposal for a more secure OLSR protocol. I will present an implemented, working prototype for this mechanism, and I will present some initial testing with this prototype.
This thesis also provides background information on topics related to the proposed security mechanism.
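The proposed mechanism, appending an authentication tag to the routing packet header and verifying it at the receiver, can be sketched as follows. The thesis proposes digital signatures; here a shared-key HMAC stands in so the example runs with the standard library alone, and the packet layout and key are invented for illustration.

```python
import hashlib
import hmac

KEY = b"network-shared-secret"  # placeholder shared key, not from the thesis

def sign_packet(header: bytes, payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag computed over header and payload."""
    tag = hmac.new(KEY, header + payload, hashlib.sha256).digest()
    return header + payload + tag

def verify_packet(packet: bytes, header_len: int):
    """Return (header, payload) if the tag checks out, else None (drop it)."""
    body, tag = packet[:-32], packet[-32:]
    expected = hmac.new(KEY, body, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None
    return body[:header_len], body[header_len:]

pkt = sign_packet(b"OLSR-HELLO", b"neighbour-list")
ok = verify_packet(pkt, len(b"OLSR-HELLO"))
# Flip one bit to simulate tampering in transit.
tampered = verify_packet(pkt[:-1] + bytes([pkt[-1] ^ 1]), len(b"OLSR-HELLO"))
```

With real digital signatures (as in the thesis), nodes would verify with the signer's public key instead of a shared secret, which avoids distributing one key to every node.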
Engell, Sondre Wabakken. Securing the OLSR Protocol. Hovedoppgave, University of Oslo, 2004
http://hdl.handle.net/10852/9111
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Engell, Sondre Wabakken&rft.title=Securing the OLSR Protocol&rft.inst=University of Oslo&rft.date=2004&rft.degree=Hovedoppgave
URN:NBN:no-8231
16915
040212246
Securing the OLSR Protocol
oai:www.duo.uio.no:10852/9559
2017-12-15T14:18:23Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2006
Summary
In 2001 the University of Oslo introduced the use of Learning Management Systems in its teaching. The system adopted at the University of Oslo is called ClassFronter and is supplied by Fronter A/S. The introduction of information and communication technology (ICT) in teaching has broad political support and is a priority area for the Norwegian authorities. Technological developments mean that a growing number of students are offered instruction over the Internet. The supplier, Fronter A/S, has stated that in the autumn of 2005 there were about 20,000 users of ClassFronter at the University of Oslo.
It is interesting to map the issues surrounding pedagogical use of ICT because the topic raises a number of unanswered questions. Typical questions include: What is pedagogical use of ICT? Is it sufficient for teacher and student to communicate by e-mail? Or are students at the University of Oslo expected to complete a master's programme where their only point of contact with the teaching is an electronic tool?
The thesis assumes that the expectations attached to the introduction of ICT in teaching have not produced the desired results with respect to pedagogical methodology. I have placed the teachers' efforts to exploit ClassFronter in a pedagogical context at the centre of the survey.
First, I examined whether teachers at the University of Oslo and at Haugenstua lower secondary school in Oslo perceive ClassFronter as a usable tool for pedagogical use of ICT. Second, I investigated whether teachers feel that ClassFronter favours or hinders one or more pedagogical approaches. Third, I asked whether teachers have a mental model for realizing their teaching in ClassFronter; a mental model is in this context a plan, a set of rules, or a framework. Fourth, I surveyed whether teachers have used discussion forums as part of their teaching. Fifth and last, I examined whether teachers take part in systematic improvement work aimed at better exploiting the pedagogical possibilities of ClassFronter.
The thesis is based on a survey carried out in the autumn semester of 2005 and the spring semester of 2006 among informants at the University of Oslo, at Haugenstua lower secondary school, and at the supplier of ClassFronter, the company Fronter A/S. There has been general agreement on the need to map pedagogical use of ICT in teaching more closely.
The results of this survey cannot be generalized beyond the environment it covers and the experience its informants possess and have presented. Few informants were users of ClassFronter. Overall, ClassFronter is not perceived as a suitable tool. Those who used ClassFronter in their teaching experienced it as labour-intensive and as additional work. ClassFronter supports traditional teaching and hinders innovation. One informant at the University of Oslo had a framework describing the work to be carried out in ClassFronter, but this informant was not a teacher. Furthermore, there was little subject-related activity in the electronic discussion forums, and some functionality was not in use. At the University of Oslo, systematic improvement work was cited as a way of making the academic community proficient in pedagogical use of ClassFronter.
Among other things, the thesis concludes that there is a need to structure the use of ClassFronter: start with a manageable template for the work and build on it through systematic improvement work that includes the teachers.
Pedersen, Trygve. Learning Management Systems (LMS) - Et pedagogisk luftslott?. Masteroppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9559
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Pedersen, Trygve&rft.title=Learning Management Systems (LMS) - Et pedagogisk luftslott?&rft.inst=University of Oslo&rft.date=2006&rft.degree=Masteroppgave
URN:NBN:no-13470
46689
061248576
Learning Management Systems (LMS) - Et pedagogisk luftslott?
oai:www.duo.uio.no:10852/9363
2014-12-26T05:04:22Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
This report presents the predicament of life-cycle mismatch between components and the system they operate in, described as component obsolescence. The problem seems most prevalent for microelectronics, where many electronic parts have life cycles shorter than that of the product they operate in. This mismatch, caused by obsolete electronic parts, can significantly increase costs for long-life systems. The military industry in particular, with its long-life products, experiences the problem on a regular basis. The impact is not limited to the defense industry, but the military has for several years acknowledged that obsolescence is a considerable challenge and has therefore focused heavily on reducing part obsolescence. This report looks at several of these efforts and presents the background of their proactive and reactive measures. Subsequently, in order to analyze the extent of the problem and learn about common solutions exercised in the industry, a survey among the foremost Norwegian original equipment manufacturers (OEMs) and contract manufacturers is presented. Based on the survey results and the background information, the report finds that successful and effective reduction of component obsolescence goes beyond common reactive thinking and instead focuses on proactive management.
In fact, the most cost-effective solutions for minimizing future component obsolescence are not complicated or costly endeavors; rather, they require a collaborative and proactive environment.
Løvland, Thor Arne. Component Obsolescence. Hovedoppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9363
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Løvland, Thor Arne&rft.title=Component Obsolescence&rft.inst=University of Oslo&rft.date=2005&rft.degree=Hovedoppgave
URN:NBN:no-11487
33373
060062606
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9363/1/Main_complete.pdf
Component Obsolescence
oai:www.duo.uio.no:10852/8826
2017-12-07T13:05:11Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2010
This paper considers computing accurate solutions of ordinary differential equations on the interval [0, 1000]. This includes implementing high-precision ODE solvers and methods to verify the accuracy of the computed solution, even for problems with chaotic behaviour. In this paper, we compute an accurate solution of the Lorenz system.
We integrate the DOLFIN ODE solver with the GNU Multiple Precision Library (GMP) and are thus able to solve ODEs with arbitrary precision. We extend the ODE solver with general tools for a posteriori error analysis, including solving the linearized dual problem, storing the primal solution, and computing stability factors. In addition, we implement a number of optimizations in DOLFIN that make it possible to use methods of very high order (∼200) with the solver.
Using these tools we study the computability of the Lorenz system in detail and show that chaotic dynamical systems, like the Lorenz system, are indeed computable.
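The underlying initial-value problem is the classical Lorenz system. The following is a plain double-precision sketch using a fixed-step RK4 scheme; the thesis itself uses much higher-order methods with arbitrary precision via GMP, which this does not reproduce.

```python
def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz system with the classical parameters."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(f, state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

state = (1.0, 0.0, 0.0)
for _ in range(10000):  # integrate to t = 10 with dt = 0.001
    state = rk4_step(lorenz, state, 0.001)
```

Because the system is chaotic, double precision loses all pointwise accuracy long before t = 1000; that loss is exactly what motivates the arbitrary-precision, high-order approach of the thesis.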
Kehlet, Benjamin Dam. Analysis and implementation of high-precision finite element methods for ordinary differential equations with application to the Lorenz system. Masteroppgave, University of Oslo, 2010
http://hdl.handle.net/10852/8826
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Kehlet, Benjamin Dam&rft.title=Analysis and implementation of high-precision finite element methods for ordinary differential equations with application to the Lorenz system&rft.inst=University of Oslo&rft.date=2010&rft.degree=Masteroppgave
URN:NBN:no-28102
110245
114645787
Fulltext https://www.duo.uio.no/bitstream/handle/10852/8826/1/Kehlet.pdf
Analysis and implementation of high-precision finite element methods for ordinary differential equations with application to the Lorenz system
oai:www.duo.uio.no:10852/10000
2014-12-26T05:04:22Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2003
Denne hovedfagsoppgaven tar for seg kombinasjonen av fagområdene mobil IT og knowledge management. Gjennom litteraturstudier og analyse av to knowledge management-prosjekter har jeg utviklet et rammeverk som kan bidra til at viktige spørsmål blir stilt i en behovsanalyse. I den forbindelse har jeg villet fastslå hvilke aspekter ved knowledge management og mobil IT som er sentrale i en behovsanalyse for slike systemer.
Både mobil IT og knowledge management har de siste årene vært
gjenstand for stor teknologisk utvikling, men kombinasjonen av mobil
IT og knowledge management har vist seg å by på mange utfordringer som krever en løsning.
Hovedproblemstillingen i denne oppgaven har vært å finne frem til en
måte å analysere de krav vi kan stille til knowledge management-systemer som skal brukes på mobile enheter. I den forbindelse har jeg sett på krav som må stilles til kunnskapen som benyttes, og designmessige aspekter ved bruk av knowledge management-systemer på mobile enheter.
Basert på erfaringer gjengitt i litteraturen laget jeg et analytisk
rammeverk. Det forbedrede knowledge management-rammeverket består av fire aspekter, kunnskap, teknologi, arbeidssituasjon og prosess. Rammeverket kan brukes til å gjennomføre en behovsanalyse for mobil knowledge management.
Analysen og diskusjonen er basert på litteraturstudiene. Funnene som
er beskrevet er gjort i forbindelse med analysen av to knowledge
management-systemer til bruk på mobile enheter: FieldWise og Pocket
Nauticus. Ved å anvende noen rammeverk på disse to systemene, fant jeg at spesielt prosess-aspektet var underutviklet.
Ved hjelp av rammeverket jeg har foreslått, har de to knowledge
management-systemene blitt analysert. På bakgrunn av analysen og
diskusjonen har jeg kommet frem til noen retningslinjer for hver av de fire komponentene som rammeverket består av. Forhåpentligvis vil
rammeverket og disse retningslinjene være med på å gjøre kombinasjonen av mobil IT og knowledge management mindre komplisert i fremtiden.
Borgersen, Kim Rune. Mobil kunnskapshåndtering. Hovedoppgave, University of Oslo, 2003
http://hdl.handle.net/10852/10000
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Borgersen, Kim Rune&rft.title=Mobil kunnskapshåndtering&rft.inst=University of Oslo&rft.date=2003&rft.degree=Hovedoppgave
URN:NBN:no-5186
8748
030454204
Fulltext https://www.duo.uio.no/bitstream/handle/10852/10000/1/rammeverk.pdf
Mobil kunnskapshåndtering : Rammeverk for "Knowledge management"
oai:www.duo.uio.no:10852/9337
2014-12-26T05:04:23Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2002
Bluetooth technology is becoming common in devices like cellular phones and personal digital assistants. Although Bluetooth started as a project to replace the cables between cellular phones and their accessories, it is now seen as a more generic way of replacing cables between all devices. With Bluetooth, devices can exchange data without cables at distances of up to 100 meters, and the user does not need the correct cable and plug to connect devices and exchange information between them.
While the Bluetooth specification has advantageous characteristics like low power consumption and resistance to interference, the Bluetooth network topology can be difficult to follow. Bluetooth devices can be set up to initiate connections without user interaction, and a device can be connected to multiple devices at the same time. This makes it hard to know what the network topology looks like at a given time, and applications cannot exploit the topology if there is no way to obtain information about it.
To solve this we introduce the concept of a Bluetooth network topology monitor. The monitor should be able to detect an initial network topology and the changes that later occur in it. We first give a detailed description of Bluetooth and related technology so that we can explore the various ways the monitor can be constructed.
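The monitor's core task, detecting an initial topology and subsequent changes, can be sketched as a diff between two snapshots of an undirected link set. This is an illustration only; the device names and the snapshot representation are invented, and obtaining the snapshots is the hard part the thesis explores.

```python
def links(topology):
    """Normalize an adjacency dict {device: set of neighbours} into an undirected link set."""
    return {frozenset((a, b)) for a, nbrs in topology.items() for b in nbrs}

def topology_changes(old, new):
    """Return (links that appeared, links that disappeared) between two snapshots."""
    old_links, new_links = links(old), links(new)
    return new_links - old_links, old_links - new_links

before = {"phone": {"headset", "laptop"}, "laptop": {"phone"}}
after = {"phone": {"headset"}, "laptop": {"pda"}, "pda": {"laptop"}}
appeared, disappeared = topology_changes(before, after)
```

Using `frozenset` pairs makes the link direction irrelevant, matching the fact that a Bluetooth connection between two devices is a single link regardless of which side initiated it.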
Borg, Fredrik. Monitoring Bluetooth network topology. Hovedoppgave, University of Oslo, 2002
http://hdl.handle.net/10852/9337
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Borg, Fredrik&rft.title=Monitoring Bluetooth network topology&rft.inst=University of Oslo&rft.date=2002&rft.degree=Hovedoppgave
URN:NBN:no-2722
2961
021239126
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9337/1/monitoring.pdf
Monitoring Bluetooth network topology
oai:www.duo.uio.no:10852/9924
2017-12-07T13:05:11Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
This thesis describes a system, Sysifos, that automates the capturing and segmentation of screen dumps of web pages. The system builds a model of the spatial structure of a page based on the segmentation, and this model is used when comparing two screen dumps. The system uses image-analysis techniques to segment the page; the resulting model is then compared to a model generated earlier. The model comparison is packaged so that it can be used as a test oracle in standard Java testing frameworks, for example JUnit or TestNG. The motivation for developing Sysifos is that there are currently no established ways to automate testing of browser rendering, although browser-related bugs are important: in this thesis one operator running 56 high-volume websites was investigated, and browser-related bugs represented around 13% of all bugs needing developer attention. Sysifos was evaluated using a test set containing known errors. It found 100% of the errors it was expected to find, but reported one false positive. Test results are visualized using SVG. The thesis shows that using a test oracle can be beneficial when testing browser rendering of web pages. Currently, the image-capturing service of Sysifos is not satisfactory with respect to speed and reliability, but the evaluation results indicate that Sysifos has the potential to become a valuable tool if the image-capturing service can be improved.
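The test-oracle idea can be illustrated with a toy version: segment two screen dumps into a coarse grid, summarize each cell, and report the cells whose summaries differ. Sysifos uses real image analysis and a Java test-framework integration; here the images are small nested lists of pixel values and the cell summary is a plain average, invented purely for illustration.

```python
def cell_means(image, grid=2):
    """Split a 2D pixel array into grid x grid cells and return each cell's mean value."""
    h, w = len(image), len(image[0])
    means = {}
    for gy in range(grid):
        for gx in range(grid):
            cells = [image[y][x]
                     for y in range(gy * h // grid, (gy + 1) * h // grid)
                     for x in range(gx * w // grid, (gx + 1) * w // grid)]
            means[(gy, gx)] = sum(cells) / len(cells)
    return means

def differing_cells(img_a, img_b, tolerance=5.0):
    """Oracle verdict: which grid cells of the two renderings disagree?"""
    a, b = cell_means(img_a), cell_means(img_b)
    return sorted(cell for cell in a if abs(a[cell] - b[cell]) > tolerance)

base = [[10] * 4 for _ in range(4)]           # reference rendering (uniform page)
broken = [row[:] for row in base]
broken[3][3] = 200                            # simulate a defect in the bottom-right
```

An empty result means the renderings agree within tolerance, which is the pass/fail signal a testing framework would assert on.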
Huse, Tarjei. Is using images to test web pages the solution to a Sisyphean task? . Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/9924
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Huse, Tarjei&rft.title=Is using images to test web pages the solution to a Sisyphean task? &rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-19825
79631
091919916
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9924/1/Huse.pdf
Is using images to test web pages the solution to a Sisyphean task?
oai:www.duo.uio.no:10852/10027
2014-12-26T05:04:25Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2008
Opera Mini™ is a mobile web browser. It was developed for mobile phones incapable of running regular web browsers and released globally in January 2006. It has gained great popularity and can be downloaded free of charge to your mobile phone. The Opera Mini client consists of a small Java MIDlet, and the only requirement is that the phone supports Java ME applications. Opera Mini is a server-client technology. Its main benefit compared to a regular mobile browser is that all web pages are pre-processed and compressed on dedicated Opera servers before the information is sent to the mobile phone. This makes the information more suitable for small handheld devices and reduces the amount of data transferred. Opera claims that this technology improves effective transfer rates by two to three times, which means better response times and lower cost for the end user. The fact that all traffic is handled by dedicated servers demands extensive link and server capacity; Opera runs a large cluster of Opera Mini servers connected with gigabit links, at considerable cost to the company.
In this thesis we analyze log files generated from Opera Mini traffic, focusing on the parameters that affect the amount of data transferred to the mobile phones. Our motivation for this choice is that in a regular Opera Mini session, the tight link of the connection is often the wireless connection between the mobile phone and the mobile operator [1]. This makes the compression level of the data transferred from the Opera Mini server to the client very important for the user experience. Different data types achieve different levels of compression: some data are harder to compress than others, and pictures, already-compressed data, and encrypted data will not compress as much as plain HTML or text files [2]. The extensive use of images in web pages is the factor we believe has the greatest impact on the achieved compression level. In most scenarios the user wants to view images, but the quality of the displayed images is of less importance. A regular Opera Mini user can choose between four image-quality settings in the browser, and our analysis reveals that these settings affect the results.
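The kind of log analysis described can be sketched as grouping entries by content type and computing the achieved compression ratio (original bytes divided by transferred bytes). The log format and the numbers below are invented for illustration, not taken from the thesis data.

```python
def compression_by_type(entries):
    """Aggregate (content_type, original_bytes, transferred_bytes) log entries
    and return the overall compression ratio per content type."""
    totals = {}
    for content_type, original, transferred in entries:
        o, t = totals.get(content_type, (0, 0))
        totals[content_type] = (o + original, t + transferred)
    return {ct: o / t for ct, (o, t) in totals.items()}

# Hypothetical log entries illustrating the pattern the thesis investigates:
# markup compresses well, already-compressed images barely shrink.
log = [
    ("text/html", 40000, 8000),
    ("text/html", 20000, 4000),
    ("image/jpeg", 50000, 40000),
]
ratios = compression_by_type(log)
```

Comparing such per-type ratios across the four image-quality settings is one way to quantify how much the settings influence the total data volume sent to the phone.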
Østen, Stian. Analyzing the compression of Opera Mini TM traffic. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/10027
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Østen, Stian&rft.title=Analyzing the compression of Opera Mini TM traffic&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-21398
89352
091974496
Fulltext https://www.duo.uio.no/bitstream/handle/10852/10027/1/Oesten_Stian.pdf
Analyzing the compression of Opera Mini TM traffic
oai:www.duo.uio.no:10852/9118
2014-12-26T05:04:30Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2004
Diffpack is a collection of C++ libraries used to numerically solve partial differential equations (PDEs) using object-oriented programming in C++. The Diffpack version for Unix/Linux platforms has so far primarily been a collection of simulators run from the command prompt only, unlike the Windows version, where a fully interactive graphical user interface (GUI) has existed for a while. There have been some attempts to change this, and several smaller GUIs have been programmed, such as gui.pl (for adjusting simulation parameters and running the simulation), simresgui (for exporting simulation results to different visualization formats) and vtkviz (for visualizing simulation results in VTK), although none of these have functioned as a unified GUI application.
The aforementioned smaller GUIs (gui.pl, simresgui, vtkviz) bypass the command prompt and offer the user a slightly more intuitive, window-based interaction with a simulator. Together they cover all of the processes in the execution chain, but their main problem is their lack of interaction with each other, as well as being rather old-fashioned; some of them also offer only a few options to the user.
The main purpose of this thesis is therefore to make a GUI application that combines all these processes into one unit and enables the user to bypass the command prompt.
Hodzic, Zlatko. Diffpack GUI. Hovedoppgave, University of Oslo, 2004
http://hdl.handle.net/10852/9118
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Hodzic, Zlatko&rft.title=Diffpack GUI&rft.inst=University of Oslo&rft.date=2004&rft.degree=Hovedoppgave
URN:NBN:no-8948
17275
041251059
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9118/1/raport.pdf
Diffpack GUI : A portable and fully interactive GUI application
oai:www.duo.uio.no:10852/9224
2020-06-18T22:35:21Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2004
This thesis presents the introduction of Digital Rights Management (DRM) systems used to protect copyrighted content, why these systems are emphasized and by whom. Legal and technical aspects of such methods are also introduced. Moreover, progress in anti-piracy techniques and reasons for the current situation with online piracy are explained. In addition to presenting an alternative model for digital entertainment business, a new distribution system based on direct subscription on downloadable media files is suggested. Positive and negative aspects of these options are discussed, indicating how copyright owners and distributors will approach these challenges.
Syversen, Kristian. Digital Rights Management - Promises, Problems and Alternative Solutions. Hovedoppgave, University of Oslo, 2004
http://hdl.handle.net/10852/9224
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Syversen, Kristian&rft.title=Digital Rights Management - Promises, Problems and Alternative Solutions&rft.inst=University of Oslo&rft.date=2004&rft.degree=Hovedoppgave
URN:NBN:no-10135
23698
050113518
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9224/3/KSthesis.pdf
Digital Rights Management - Promises, Problems and Alternative Solutions
oai:www.duo.uio.no:10852/9767
2017-12-07T13:05:12Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2007
It is well known that software documentation in open source projects is often poor and incomplete. Open source communities are generally driven by project members doing what they want to do, and because few programmers enjoy writing documentation, many open source projects are poorly documented compared to proprietary projects. This does not mean that documentation is any less important in open source projects, and this thesis looks at why it is so hard to provide good documentation. Findings from this thesis show that even if all project members agree that documentation is important, resource constraints mean that the time and effort necessary to create quality documentation are not necessarily provided.
The thesis also describes how lack of documentation affects new project members who try to contribute to a project. Several new project members found the available documentation messy and outdated, making it hard to contribute. Poor documentation can also reduce the number of people willing to contribute to an open source project.
The thesis is based on an action research project where the author has participated in the development of a health information system, District Health Information System version 2 (DHIS 2), within the Health Information System Programme (HISP) network.
Store, Margrethe. Explore the challenges of providing documentation in open source projects. Masteroppgave, University of Oslo, 2007
http://hdl.handle.net/10852/9767
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Store, Margrethe&rft.title=Explore the challenges of providing documentation in open source projects&rft.inst=University of Oslo&rft.date=2007&rft.degree=Masteroppgave
URN:NBN:no-15782
63969
071600779
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9767/2/store.pdf
Explore the challenges of providing documentation in open source projects
oai:www.duo.uio.no:10852/9522
2017-12-07T13:05:12Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2006
Multiple-valued logic is an alternative to traditional binary logic in which the two states, on and off, are replaced by several logical values. One promising way of realizing multiple-valued circuits is to use voltage-mode floating gates with a recharge latching scheme. For multiple-valued floating-gate circuits to yield the correct result, it is very important that the gain is accurate. In this master thesis we present a novel floating-gate inverter with varactor feedback that serves as a gain-correction device, in that the gain can be corrected by altering the varactor capacitance. We discuss the benefits and limitations of the device, and describe its performance in different circuit environments through software simulation.
Storm, Alf. Floating Gate Inverter with Varactor Feedback for Gain Correction in Multiple-Valued Circuits. Masteroppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9522
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Storm, Alf&rft.title=Floating Gate Inverter with Varactor Feedback for Gain Correction in Multiple-Valued Circuits&rft.inst=University of Oslo&rft.date=2006&rft.degree=Masteroppgave
URN:NBN:no-12728
43246
061302686
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9522/1/Storm.pdf
Floating Gate Inverter with Varactor Feedback for Gain Correction in Multiple-Valued Circuits
oai:www.duo.uio.no:10852/9343
2013-03-12T07:58:31Z
com_10852_2
com_10852_1
col_10852_3
00925njm 22002777a 4500
dc
2005
Summary
This master thesis is a study of a specific process. While working on the development of the security system for Øyafestivalen in Oslo, I wanted to improve the process and make it more structured. To do this I applied the Object-Oriented Analysis and Design (OOAD) approach and then studied the process to evaluate how effective and fitting this methodology was for this kind of project.
Specific areas addressed are:
- Quality management
- Risk management
- Model-based tools:
  - Rich pictures
  - UML
  - State charts
  - Event diagrams
  - Risk analysis
- OOAD activities
Method
I participated in the development process myself and evaluated it according to my areas of interest. In addition, I conducted five qualitative interviews using an open, non-standardized approach.
Results
The methodology seemed well suited to many of the areas addressed. Most problems concerned the event-business sector and its usual lack of structured projects. The mix of object-oriented thinking and risk analysis in particular seems effective and reasonable, as well as quality-improving. The use of the different model-based tools was successful, but not all of the tools were suitable or cost-effective enough to be used in this business sector.
Svantorp, Kristian. Development of event security systems. Masteroppgave, University of Oslo, 2005
http://hdl.handle.net/10852/9343
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Svantorp, Kristian&rft.title=Development of event security systems&rft.inst=University of Oslo&rft.date=2005&rft.degree=Masteroppgave
URN:NBN:no-11151
30463
051402564
Development of event security systems : an object oriented approach
oai:www.duo.uio.no:10852/9451
2017-12-07T13:05:12Z
2006
Shakari, Peyman. Object Oriented Feature Modeler (OOFM). Masteroppgave, University of Oslo, 2006
http://hdl.handle.net/10852/9451
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Shakari, Peyman&rft.title=Object Oriented Feature Modeler (OOFM)&rft.inst=University of Oslo&rft.date=2006&rft.degree=Masteroppgave
URN:NBN:no-12368
40764
060963816
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9451/2/Shakari.pdf
Object Oriented Feature Modeler (OOFM) : A System Family Modeling Tool
oai:www.duo.uio.no:10852/9958
2017-12-07T13:05:12Z
2008
This thesis describes a study performed in an industrial setting that attempts to build predictive models to identify parts of a Java system with a high fault probability. The system under consideration is constantly evolving as several releases a year are shipped to customers. Developers usually have limited resources for their testing, so our aim was to build optimal and practically useful fault-proneness prediction models to help focus verification and validation activities on the most fault-prone components of this system.
This thesis starts off with a literature review that provides detailed discussions of the state of the art of research on fault-proneness prediction models. The review revealed that a vast number of modeling techniques have been used to build such prediction models. However, there has been little systematic effort to assess the impact of selecting a particular modeling technique. Furthermore, there has been no systematic study of the impact of including certain alternative types of measures as predictors. Finally, many studies apply evaluation methods and model assessment criteria that, depending on the intended use of the prediction model, might be insufficient or even inappropriate. Consequently, the main research focus of this thesis is to systematically assess three aspects of how to build and evaluate fault-proneness models in the context of a large Java legacy system development project: (1) compare many data mining and machine learning techniques for building fault-proneness models, (2) assess the impact of using different metric sets, such as source code structural measures and historic change/fault (process) measures, and (3) compare several alternative ways of assessing the performance of the models, in terms of (i) confusion matrix criteria such as accuracy and precision/recall, (ii) ranking ability, using the area under the receiver operating characteristic curve (ROC), and (iii) our proposed cost-effectiveness measure (CE).
The results of the study indicate that the choice of modeling technique has limited impact on the resulting classification accuracy or cost-effectiveness. There are, however, large differences between the individual metric sets in terms of cost-effectiveness, and although the process measures are among the most expensive ones to collect, including them as candidate measures significantly improves the prediction models compared with models that only include structural measures and/or their deltas, both in terms of ROC area and in terms of cost-effectiveness. Furthermore, we observe that what is considered the best model is highly dependent on the criteria used to evaluate and compare the models. The regular confusion matrix criteria, although popular, are not clearly related to the problem at hand, namely the cost-effectiveness of using fault-proneness prediction models to focus verification efforts and deliver software with fewer faults at lower cost. Consequently, to assess the usefulness of prediction models, we consider the regular confusion matrix criteria of less importance, and recommend using ROC and our proposed cost-effectiveness measure instead. Another contribution of this thesis is a statistically based method for the systematic comparison of fault-proneness prediction models. The method can be reused in future studies to guide the selection of optimal prediction models.
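The distinction the abstract draws between confusion-matrix criteria and ranking ability can be sketched in a few lines. The following is a minimal illustration, not the thesis's own code: the labels and scores are invented for demonstration, and the thesis's proposed cost-effectiveness (CE) measure is defined in the thesis itself and is not reproduced here.

```python
# Two ways of evaluating a binary fault-proneness classifier:
# thresholded confusion-matrix metrics vs. threshold-free ROC AUC.

def confusion_metrics(y_true, y_pred):
    """Accuracy, precision and recall for binary labels (1 = fault-prone)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

def roc_auc(y_true, scores):
    """ROC area via the rank-sum (Mann-Whitney) formulation: the probability
    that a random fault-prone item is scored above a random fault-free one."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented example data: true fault labels and predicted fault probabilities.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
scores = [0.9, 0.4, 0.8, 0.35, 0.5, 0.2, 0.7, 0.1]
y_pred = [1 if s >= 0.5 else 0 for s in scores]  # one fixed threshold

acc, prec, rec = confusion_metrics(y_true, y_pred)
auc = roc_auc(y_true, scores)
print(acc, prec, rec, auc)  # → 0.75 0.75 0.75 0.875
```

The point of contrast matches the abstract's argument: the confusion-matrix numbers depend on an arbitrary classification threshold, while the ROC area evaluates the ranking itself, which is closer to how a prediction model is used to prioritize verification effort.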
Johannessen, Eivind Berg. Data mining techniques, candidate measures and evaluation methods for building practically useful fault-proneness prediction models. Masteroppgave, University of Oslo, 2008
http://hdl.handle.net/10852/9958
info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Johannessen, Eivind Berg&rft.title=Data mining techniques, candidate measures and evaluation methods for building practically useful fault-proneness prediction models&rft.inst=University of Oslo&rft.date=2008&rft.degree=Masteroppgave
URN:NBN:no-19845
82051
091963818
Fulltext https://www.duo.uio.no/bitstream/handle/10852/9958/1/johannessen.pdf
Data mining techniques, candidate measures and evaluation methods for building practically useful fault-proneness prediction models