Distributed and Self-organizing Systems
Seminar Web Engineering (SS 2023)



Welcome to the homepage of the Seminar Web Engineering

This website contains all important information about the seminar, including links to available topics as well as information about the seminar process in general.

The interdisciplinary research area Web Engineering develops approaches for the methodological construction of Web-based applications and distributed systems as well as their continuous development (evolution). For instance, Web Engineering deals with the development of interoperable Web Services, the implementation of web portals using service-oriented architectures (SOA), fully accessible user interfaces or even exotic web-based applications that are voice controlled via the telephone or that are represented on TV and Radio.

The following steps are necessary to complete the seminar:

  • Preparation of a presentation about the topic assigned to you.
  • An additional written report on your topic.
  • Each report is reviewed by two or three other participants.

Seminar chairs

  • Traubinger
  • Haas
  • Gaedke


Contact

If you have any questions concerning this seminar or the exam as a participant, please contact us via OPAL.

We also offer a feedback system where you can provide anonymous feedback on a particular session to the presenter, on what you liked or where we can improve.

Participants

The seminar is offered for students of the following programmes (for pre-requisites, please refer to your study regulations):

If your programme is not listed here, please contact us prior to seminar registration and indicate your study programme, the version (year) of your study regulations (Prüfungsordnungsversion) and the module number (Modulnummer) to allow us to check whether we can offer the seminar for you and find an appropriate mapping.

Registration

You may only participate after registration in the Seminar Course in OPAL.

The registration opens on 27.03.2023 and ends on 07.04.2023 at 23:59. As the available slots are usually booked rather quickly, we recommend completing your registration soon after it opens.

Topics and Advisors

Questions:

  • How does a Systematic Literature Review work? Prepare a guideline for computer science students explaining the main aspects and include a list of relevant publication search engines/catalogues.
  • What does "systematic" mean in SLR, and how is it different from other literature review methods? How does it compare to a Structured Literature Review? How does it compare to Systematic Mapping Studies? What are risks and limitations of the method?
  • How are research questions represented/quantified? What does coding mean in this context?
  • How are search queries constructed? Explain the technique of query expansion for generating additional queries.
  • Which SLR artifacts should be provided to allow for reproducibility and replicability?
  • What tools exist to support SLRs? Demonstrate a suitable tool.
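The query-expansion technique mentioned above can be sketched in a few lines; the synonym table and the term list below are purely illustrative assumptions, not an established SLR thesaurus (Python):

```python
# Sketch: boolean query expansion for SLR search strings.
# The synonym table is an illustrative assumption, not a curated thesaurus.
SYNONYMS = {
    "web service": ["web API", "RESTful service"],
    "evaluation": ["assessment", "validation"],
}

def expand_query(terms):
    """OR each core term with its known synonyms, then AND the groups."""
    groups = []
    for term in terms:
        alternatives = [term] + SYNONYMS.get(term, [])
        groups.append("(" + " OR ".join(f'"{a}"' for a in alternatives) + ")")
    return " AND ".join(groups)

query = expand_query(["web service", "evaluation"])
```

The same expansion idea scales to controlled vocabularies such as the ACM Computing Classification System.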

Literature:

  • Kitchenham, B. (2004). Procedures for Undertaking Systematic Reviews. https://www.inf.ufsc.br/~aldo.vw/kitchenham.pdf
  • Kitchenham, B., Pearl Brereton, O., Budgen, D., Turner, M., Bailey, J., & Linkman, S. (2009). Systematic literature reviews in software engineering - A systematic literature review. Information and Software Technology, 51(1), 7–15.
  • Brereton, P., Kitchenham, B. a., Budgen, D., Turner, M., & Khalil, M. (2007). Lessons from applying the systematic literature review process within the software engineering domain. Journal of Systems and Software, 80(4), 571–583.
  • Petersen, K., Vakkalanka, S., & Kuzniarz, L. (2015). Guidelines for conducting systematic mapping studies in software engineering: An update. Information and Software Technology, 64, 1–18.
  • Díaz, O., Medina, H., & Anfurrutia, F. I. (2019). Coding-Data Portability in Systematic Literature Reviews. Proceedings of the Evaluation and Assessment on Software Engineering - EASE ’19, 178–187.
  • Khadka, R., Saeidi, A. M., Idu, A., Hage, J., & Jansen, S. (2013). Legacy to SOA Evolution: A Systematic Literature Review. In A. D. Ionita, M. Litoiu, & G. Lewis (Eds.), Migrating Legacy Applications: Challenges in Service Oriented Architecture and Cloud Computing Environments (pp. 40–71). IGI Global.
  • Jamshidi, P., Ahmad, A., & Pahl, C. (2013). Cloud Migration Research: A Systematic Review. IEEE Transactions on Cloud Computing, 1(2), 142–157.
  • Rai, R., Sahoo, G., & Mehfuz, S. (2015). Exploring the factors influencing the cloud computing adoption: a systematic study on cloud migration. SpringerPlus, 4(1), 197.
  • A. Hinderks, F. José, D. Mayo, J. Thomaschewski and M. J. Escalona, "An SLR-Tool: Search Process in Practice : A tool to conduct and manage Systematic Literature Review (SLR)," 2020 IEEE/ACM 42nd International Conference on Software Engineering: Companion Proceedings (ICSE-Companion), 2020, pp. 81-84.
  • PRISMA 2020 http://www.prisma-statement.org/

Questions:

  • What is empirical Software Engineering Evaluation and how can it be done?
  • Why is evaluation important in the research process?
  • What is the difference between a qualitative and quantitative evaluation? (When do you use which one? What are advantages and disadvantages?)
  • Prepare a list of evaluation methods and tools that can be used to evaluate software. Explain them and add relevant literature for these methods.
  • Demonstrate one quantitative and one qualitative method. For this, find a feasible research question, conduct a survey on it with each of the two methods, compute the results, and discuss and present them. You can choose a small topic of your own that is related to Web Engineering. Use statistical methods to compute the results.
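For the quantitative part, a minimal statistical comparison of two respondent groups could look like the following; the Likert responses are made-up example values, not real survey data (Python):

```python
# Sketch: Welch's t-statistic for comparing two independent survey groups,
# using the standard library only. The response values are invented examples.
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples with unequal variance."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

# Hypothetical 1-5 Likert responses from two participant groups.
group_a = [4, 5, 3, 4, 5, 4]
group_b = [2, 3, 3, 2, 4, 3]
t = welch_t(group_a, group_b)
```

To turn the statistic into a p-value you would additionally need the Welch-Satterthwaite degrees of freedom and a t-distribution, e.g. via SciPy.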

Literature:

  • Own research
  • Creswell, J. W. (2014). Research design : Qualitative, quantitative, and mixed methods approaches (4. ed., in). SAGE. https://katalog.bibliothek.tu-chemnitz.de/Record/0008891954
  • Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., Regnell, B., & Wesslén, A. (2012). Experimentation in Software Engineering. In Experimentation in Software Engineering (Vol. 9783642290). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-29044-2
  • Chatzigeorgiou, A., Chaikalis, T., Paschalidou, G., Vesyropoulos, N., Georgiadis, C. K., & Stiakakis, E. (2015). A Taxonomy of Evaluation Approaches in Software Engineering. Proceedings of the 7th Balkan Conference on Informatics Conference - BCI ’15, 1–8. https://doi.org/10.1145/2801081.2801084
  • Wainer, J., Novoa Barsottini, C. G., Lacerda, D., & Magalhães de Marco, L. R. (2009). Empirical evaluation in Computer Science research published by ACM. Information and Software Technology, 51(6), 1081–1085. https://doi.org/10.1016/j.infsof.2009.01.002

Questions:

  • What is design science research? What are the objectives of design science research?
  • What are its activities? How is research conducted? How are results evaluated?
  • In which research areas of computer science is this methodology most practical?
  • Using design science research, produce a viable, simplified artifact in the form of a construct, a model, or a method, and demonstrate the activities involved.

Literature:

  • Own research
  • Johannesson & Perjons (2021), An Introduction to Design Science, https://link.springer.com/book/10.1007/978-3-030-78132-3

Questions:

  • How is a scientific work, especially a thesis in computer science, structured? What sections should a thesis contain and what purpose do they have? Give an overview.
  • What is the importance of an introduction? What should it contain? How long should it be? How is it related to the other sections in a scientific work?
  • What makes a "good" motivation for your scientific work? Why is it important for the readers? What are methods to write it so the reader can relate to the writer?
  • What is the scope of a scientific work? Why is it important? How should you include the scope in the introduction?
  • What are current and well-known best practices/guidelines/schemes/principles/advice? Which evidence base (e.g. experimental studies) supports them? Present them.
  • Choose a suitable scientific work and work out a way to visually represent its whole structure. Show how the introduction, motivation and scope relate to the other parts.

Literature:

  • Own research
  • Peat, J., Elliott, E., Baur, L., & Keena, V. (2013). Scientific writing: easy when you know how. John Wiley & Sons. DOI:10.1002/9781118708019
  • Barbara Minto: The Pyramid Principle. Pearson Education, 2009.
  • Mensh, B., & Kording, K. (2017). Ten simple rules for structuring papers. PLoS computational biology, 13(9), e1005619. DOI: https://doi.org/10.1371/journal.pcbi.1005619
  • J. M. Setchell, “Writing a Scientific Report,” in Studying Primates: How to Design, Conduct and Report Primatological Research, Cambridge: Cambridge University Press, 2019, pp. 271–298.
  • Blackwell, J., & Martin, J. (2011). A scientific approach to scientific writing. Springer Science & Business Media.
  • Williams, J. M., & Bizup, J. (2014). Lessons in clarity and grace. Pearson.
  • Oguduvwe, J. I. P. (2013). Nature, Scope and Role of Research Proposal in Scientific Investigations. IOSR Journal Of Humanities And Social Science (IOSR-JHSS), 17(2), 83-87. https://www.iosrjournals.org/iosr-jhss/papers/Vol17-issue2/L01728387.pdf

Questions:

  • What is the current state of research in quantum web services? Which groups are doing research in this field? What are the current research challenges? Explain the mismatch between the computing, mathematics and physics perspectives on the one hand and the software engineering perspective on the other.
  • Can web services in general benefit from quantum computing, or is it rather applicable to specific problems? Which kinds of web services could benefit the most from quantum capabilities?
  • What are current capabilities of quantum computers? Which features are currently not possible especially with regard to web services?
  • What are current practical problems, assuming you have a software solution that could benefit from quantum computing and want to provide it as a quantum web service? What do the resulting hybrid software architectures of quantum web services generally look like?
  • What are relevant languages for modeling and tools/platforms for developing, testing and hosting quantum-computing-based software? How do they compare? Prepare a demo with a suitable tool/platform.

Questions:

  • What guidelines and principles exist to safeguard Good Research Practice?
  • How can these guidelines and principles be integrated into the research process?
  • What is scientific misconduct/scientific malpractice?
  • What is considered "high-quality research"? What are indicators thereof?

Literature:

  • Guidelines for Safeguarding Good Research Practice https://www.dfg.de/download/pdf/foerderung/rechtliche_rahmenbedingungen/gute_wissenschaftliche_praxis/kodex_gwp_en.pdf
  • The European Code of Conduct for Research Integrity http://www.allea.org/wp-content/uploads/2017/03/ALLEA-European-Code-of-Conduct-for-Research-Integrity-2017-1.pdf
  • Open Research Data and Data Management Plans https://erc.europa.eu/sites/default/files/document/file/ERC_info_document-Open_Research_Data_and_Data_Management_Plans.pdf
  • Own research

Questions:

  • What are mashups?
  • What are the different techniques for creating a voice-based or chatbot-based Web mashup?
  • How can an existing mashup be modified with natural interactions?

Questions:

  • Why is explainable planning important in the context of end users?
  • What does it mean for the behaviour of an AI planner to be explainable?
  • What are the challenges and solutions?

Questions:

  • What is MLOps? How does it relate to DevOps and CI/CD? Which patterns/best practices/principles exist? Which phases of the ML lifecycle (e.g. dataset creation, feature engineering, training, predictions/usage, re-training/model updates) are supported, and in which ways?
  • Which methods/techniques from DevOps does it apply? What characteristics specific to ML approaches require a different approach?
  • Which tools and platforms exist to support MLOps? Prepare a demonstration using suitable tools for an MLOps scenario.

Questions:

  • What does the term Open Science mean? How many scientific works are published in this way?
  • Consider the following terms: Open Access, Open Data and Open Source. Give an overview of each topic, explain what differs from the traditional publishing process, and note what is important at each step.
  • How do they differ from the FAIR principles? What are the differences to publishing works on ResearchGate, arXiv, etc.?
  • How does the review process work in Open Access Publishing? What are funding possibilities?
  • For the demonstration, show which platforms TU Chemnitz offers for publishing a paper in each aspect and where you may need to use another platform. Also give a brief overview of similar platforms and tools.

Questions:

  • Which technologies, frameworks, platforms and architectures are currently used by major web and mobile applications (at least 100)?
  • To answer the question, conduct a survey gathering publicly available information from web sources, and identify technologies through tools and analysis of HTTP headers etc.
  • For each application, identify the basic architecture with the main top-level components (e.g. Mobile App, API Gateway, Load Balancer, Database, Indexer, etc.). For these components, identify the technologies, frameworks and platforms used (e.g. programming language, frameworks for application logic and presentation, DBMS, web server, cloud platform etc.), and for each of these data points keep track of the source of information (e.g. the URL of the corresponding web resource, the endpoint and tool used for analysis, the binary artifact analyzed etc.).
  • Further analyze the survey results to identify groups of similar technology stacks, patterns, frequencies of technologies per component type, common architectures etc.
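The header-based identification step can be sketched as follows; the header list is a small illustrative selection of commonly revealing headers, not a complete fingerprint database (Python):

```python
# Sketch: collect HTTP response headers that commonly reveal parts of a
# technology stack. The header list is a small illustrative selection.
from urllib.request import urlopen

REVEALING_HEADERS = ("Server", "X-Powered-By", "Via", "X-Generator")

def extract_hints(headers):
    """Keep only the headers that tend to expose server-side technology."""
    return {h: v for h, v in headers.items() if h in REVEALING_HEADERS}

def tech_hints(url):
    """Fetch a URL and return its technology-revealing response headers."""
    with urlopen(url, timeout=10) as resp:
        return extract_hints(dict(resp.headers))
```

Dedicated tools such as Wappalyzer also inspect cookies, HTML markers and JavaScript globals, so headers alone give only a partial picture.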

Literature:

  • Survey at least 100 major web and mobile applications, below are some examples
  • Search Engines: Google Search, DuckDuckGo, Yahoo Search
  • E-Commerce: Ebay, Amazon, zalando, Alibaba, rakuten
  • Social Media (web apps): Facebook, Twitter, Youtube
  • Social Media (mobile apps): WhatsApp, TikTok, Snapchat, Telegram, Twitter, Facebook
  • Super Apps: WeChat, Grab, Omni, Alipay

Questions:

  • What are well known ways to query knowledge graphs like a Web API?
  • What are good ways to integrate data from Knowledge Graphs into Web Applications?
  • What are the limits of Knowledge Graph data usage in Web Applications?

Questions:

  • How can distributed data be acquired trustworthily with governance?
  • What is a reasonable approach for introducing governance to common web application architectures like MVC?
  • What are the limits and open challenges of governance for trustworthy data acquisition?

Questions:

  • What is the current state of Web Engineering research? To answer this question, systematically analyze all publications of the two venues listed under Literature as detailed below. Your primary information sources should be the title, authors/affiliations, keywords, and abstract.
  • For each publication, capture title, authors/affiliations, keywords, abstract, venue, year, (for conference papers) name of track/workshop, (for journal articles) volume number, issue number and name of issue, page numbers in proceedings/issue, length of the publication, current number of citations of the publication, and URL of the online resource.
  • Based on your raw data collection, analyze the following aspects: 1. What are the main topics of research interest and in which areas of the Web Engineering field, along with the number of publications belonging to them? 2. What authors are publishing in these venues, from which affiliations, from which countries, along with the number of publications for each of these? 3. Which are the most cited articles (relative to their age), which topics/areas receive the most citations, which authors/affiliations/countries receive the most citations? 4. Considering the time dimension, are there any visible trends for aspects 1-3 over the 5 years considered?
  • Visualize your data and insights and provide the raw data in re-usable form (CSV).
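A first aggregation over the raw data collection could be sketched like this; the records below are invented placeholders standing in for rows of the real CSV export (Python):

```python
# Sketch: count publications per venue and per year from raw records.
# The records are invented placeholders for the rows of the CSV export.
from collections import Counter

records = [
    {"venue": "ICWE", "year": 2020, "title": "Example A"},
    {"venue": "ICWE", "year": 2021, "title": "Example B"},
    {"venue": "JWE", "year": 2020, "title": "Example C"},
]

per_venue = Counter(r["venue"] for r in records)
per_year = Counter(r["year"] for r in records)
```

The same per-field counting extends directly to authors, affiliations and countries, and `csv.DictReader` yields records in exactly this dictionary shape.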

Literature:

  • Venue 1: ICWE proceedings of the last 5 complete years (2018-2022)
  • Venue 2: JWE journal issues of the last 5 complete years (2018-2022)
  • For citation counts use: Google Scholar
  • Tool for analysis and inspiration for your data visualization: https://www.connectedpapers.com/

Questions:

  • What are dark patterns? Provide a brief overview of existing definitions and outline how to differentiate them from poor usability/UX design.
  • Which taxonomies for dark patterns exist? What are the criteria/dimensions that they use? Are there regional/cultural/language differences?
  • Provide an overview of existing studies, datasets and empirical experiments about dark patterns. Which aspects are they focused on? What kind of evidence is provided?

Literature:

  • Arunesh Mathur, Gunes Acar, Michael J. Friedman, Eli Lucherini, Jonathan Mayer, Marshini Chetty, and Arvind Narayanan. 2019. Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites. Proc. ACM Hum.-Comput. Interact. 3, CSCW, Article 81 (November 2019), 32 pages. https://doi.org/10.1145/3359183
  • Arvind Narayanan, Arunesh Mathur, Marshini Chetty, and Mihir Kshirsagar. 2020. Dark Patterns: Past, Present, and Future: The evolution of tricky user interfaces. Queue 18, 2, Pages 10 (March-April 2020), 26 pages. https://doi.org/10.1145/3400899.3400901
  • Johanna Gunawan, Amogh Pradeep, David Choffnes, Woodrow Hartzog, and Christo Wilson. 2021. A Comparative Study of Dark Patterns Across Web and Mobile Modalities. Proc. ACM Hum.-Comput. Interact. 5, CSCW2, Article 377 (October 2021), 29 pages. https://doi.org/10.1145/3479521
  • Arunesh Mathur, Mihir Kshirsagar, and Jonathan Mayer. 2021. What Makes a Dark Pattern... Dark? Design Attributes, Normative Considerations, and Measurement Methods. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21). Association for Computing Machinery, New York, NY, USA, Article 360, 1–18. https://doi.org/10.1145/3411764.3445610
  • Aditi M. Bhoot, Mayuri A. Shinde, and Wricha P. Mishra. 2021. Towards the Identification of Dark Patterns: An Analysis Based on End-User Reactions. In Proceedings of the 11th Indian Conference on Human-Computer Interaction (IndiaHCI '20). Association for Computing Machinery, New York, NY, USA, 24–33. https://doi.org/10.1145/3429290.3429293
  • Kai Lukoff, Alexis Hiniker, Colin M. Gray, Arunesh Mathur, and Shruthi Sai Chivukula. 2021. What Can CHI Do About Dark Patterns? In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (CHI EA '21). Association for Computing Machinery, New York, NY, USA, Article 102, 1–6. https://doi.org/10.1145/3411763.3441360
  • Thomas Mildner and Gian-Luca Savino. 2021. Ethical User Interfaces: Exploring the Effects of Dark Patterns on Facebook. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (CHI EA '21). Association for Computing Machinery, New York, NY, USA, Article 464, 1–7. https://doi.org/10.1145/3411763.3451659
  • Karagoel, I., & Nathan-Roberts, D. (2021). Dark Patterns: Social Media, Gaming, and E-Commerce. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 65(1), 752–756. https://doi.org/10.1177/1071181321651317
  • Colin M. Gray, Yubo Kou, Bryan Battles, Joseph Hoggatt, and Austin L. Toombs. 2018. The Dark (Patterns) Side of UX Design. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18). Association for Computing Machinery, New York, NY, USA, Paper 534, 1–14. https://doi.org/10.1145/3173574.3174108
  • Madison Fansher, Shruthi Sai Chivukula, and Colin M. Gray. 2018. #darkpatterns: UX Practitioner Conversations About Ethical Design. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems (CHI EA '18). Association for Computing Machinery, New York, NY, USA, Paper LBW082, 1–6. https://doi.org/10.1145/3170427.3188553
  • Yvonne Rogers, Margot Brereton, Paul Dourish, Jodi Forlizzi, and Patrick Olivier. 2021. The Dark Side of Interaction Design. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (CHI EA '21). Association for Computing Machinery, New York, NY, USA, Article 152, 1–2. https://doi.org/10.1145/3411763.3450397

Questions:

  • Compare the language models of AI chatbots (at least 6). Build a meaningful taxonomy for this and apply it to these chatbots.
  • You can use the following questions as leads for your specifications: Which language model is used? Who has access? What is its training data set? How do they implement privacy? What is its response time? How consistent is it? How truthful are its answers? Is their code open source? Which companies are involved? Add your own specifications.
  • Build an AI chatbot application in the context of the university, for example for planning the schedule, giving information on study courses, etc. Demonstrate your implementation.

Literature:

  • Janssen, A., Passlick, J., Rodríguez Cardona, D. et al. Virtual Assistance in Any Context. Bus Inf Syst Eng 62, 211–225 (2020). https://doi.org/10.1007/s12599-020-00644-1
  • Adamopoulou, E., & Moussiades, L. (2020). An overview of chatbot technology. In Artificial Intelligence Applications and Innovations: 16th IFIP WG 12.5 International Conference, AIAI 2020, Neos Marmaras, Greece, June 5–7, 2020, Proceedings, Part II 16 (pp. 373-383). Springer International Publishing. https://doi.org/10.1007/978-3-030-49186-4_31
  • Own research
  • tba

Questions:

  • What are Conversational User Interfaces (CUIs)? What types do exist? What are they used for? What is the difference to Graphical User Interfaces (GUIs)?
  • Which accessibility guidelines can be applied to CUIs? Compile them into a guideline with the necessary accessibility rules.
  • Choose a CUI type and show in an application how it can be made accessible.

Literature:

  • Own Research
  • Kate Lister, Tim Coughlan, Francisco Iniesto, Nick Freear, and Peter Devine. 2020. Accessible conversational user interfaces: considerations for design. In Proceedings of the 17th International Web for All Conference (W4A '20). Association for Computing Machinery, New York, NY, USA, Article 5, 1–11. https://doi.org/10.1145/3371300.3383343
  • https://www.w3.org/TR/wcag-3.0/
  • Story, M.F.: Principles of universal design. Universal design handbook, Second Edition, McGraw-Hill, 2001.

Questions:

  • What is the objective of the EBSI (European Blockchain Services Infrastructure)?
  • For which use cases can the EBSI be utilised?
  • What architecture does the EBSI have and what technologies are used?
  • What is the Verifiable Credentials Data Model? What can it be used for? How does it relate to the EBSI?

Questions:

  • What is Low-Code / No-Code? What is the idea behind it? What are Low Code Development Platforms?
  • How does the Low-Code / No-Code approach compare to Model Driven Engineering?
  • For which use cases has it been applied successfully?
  • Which challenges and limitations exist to the Low-Code / No-Code approach?
  • How can the approach help in the context of Digital Transformation?

Literature:

  • D. Di Ruscio, D. Kolovos, J. de Lara, A. Pierantonio, M. Tisi, and M. Wimmer, “Low-code development and model-driven engineering: Two sides of the same coin?,” Softw Syst Model, vol. 21, no. 2, pp. 437–446, Apr. 2022, doi: 10.1007/s10270-021-00970-2.
  • V. S. Phalake and S. D. Joshi, “Low Code Development Platform for Digital Transformation,” in Information and Communication Technology for Competitive Strategies (ICTCS 2020), M. S. Kaiser, J. Xie, and V. S. Rathore, Eds., in Lecture Notes in Networks and Systems. Singapore: Springer Nature, 2021, pp. 689–697. doi: 10.1007/978-981-16-0882-7_61.
  • A. Sahay, A. Indamutsa, D. Di Ruscio, and A. Pierantonio, “Supporting the understanding and comparison of low-code development platforms,” in 2020 46th Euromicro Conference on Software Engineering and Advanced Applications (SEAA), Portoroz, Slovenia: IEEE, Aug. 2020, pp. 171–178. doi: 10.1109/SEAA51224.2020.00036.
  • N. Prinz, C. Rentrop, and M. Huber, “Low-Code Development Platforms – A Literature Review”.
  • Own research

Questions:

  • What is Trust Evaluation?
  • How can the accuracy of trust evaluations be measured?
  • What is a good way to implement such a trust evaluation accuracy test in a web-based testbed, like aTLAS?

Questions:

  • What approaches/technologies exist to tackle Authentication and Authorization in Decentralized Knowledge Graphs and Web Data?
  • What are open challenges on Authentication for decentralized knowledge graphs?
  • What are open challenges on Authorization policies for decentralized knowledge graphs?

Questions:

  • What algorithms for indexing vector data exist?
  • How well do they perform with regard to memory consumption, speed of retrieval and accuracy?
  • What is the use of a vector database?
  • What metrics can be used for querying a vector in a vector database?
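Two of the metrics commonly offered for querying vector databases can be defined directly; this is a standard-library sketch, not tied to any particular database product (Python):

```python
# Sketch: cosine similarity and Euclidean distance, two metrics commonly
# offered by vector databases for nearest-neighbour queries.
from math import sqrt

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cosine_similarity(a, b):
    """1.0 for identical directions, 0.0 for orthogonal vectors."""
    return dot(a, b) / (sqrt(dot(a, a)) * sqrt(dot(b, b)))

def euclidean_distance(a, b):
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
```

Index structures such as HNSW or IVF approximate nearest-neighbour search under exactly these metrics, trading accuracy for speed and memory.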

Seminar Opening

The date and time of the seminar opening meeting will be announced via OPAL.

Short Presentation

The date and time of the short presentations will be announced via OPAL.

In your short presentation, you will provide a brief overview on your selected topic.

This includes the following aspects:

  1. What is your topic about?
  2. Which literature sources did you research so far?
  3. What is your idea for a demonstration?

Following your short presentations, the advisors will provide you with feedback and hints for your full presentations.

Hints for your Presentation

  • As a rule of thumb, you should plan 2 minutes per slide. A significantly higher number of slides per minute exceeds the perceptive capacity of your audience.
  • Prior to your presentation, you should consider the following points: What is the main message of my presentation? What should the listeners take away?
    Your presentation should be created based on these considerations.
  • The following site provides many good hints: http://www.garrreynolds.com/preso-tips/

Seminar Days

The dates and times of the seminar days will be announced via OPAL.

Report

  • Important hints on citing:
    • Any statement which does not originate from the author has to be provided with a reference to the original source.
    • "When to Cite Sources" - a very good overview by Princeton University
    • Examples for correct citation can be found in the IEEE-citation reference
    • Web resources are cited with author, title and date including URL and Request date. For example:
      • [...] M. Nottingham and R. Sayre. (2005). The Atom Syndication Format - Request for Comments: 4287 [Online]. Available: http://www.ietf.org/rfc/rfc4287.txt (18.02.2008).
      • [...] Microsoft. (2015). Microsoft Azure Homepage [Online]. Available: http://azure.microsoft.com/ (23.09.2015).
      • A URL should be a clickable hyperlink where technically possible.
  • Further important hints for the submission of your written report:
    • Apart from justifiable exceptions (for instance highlighting text using <strong>...</strong>), use only HTML elements that occur in the template. The provided CSS file may not be changed.
    • Before submitting your work, carefully check spelling and grammar, preferably with software support, for example with the spell checker of Microsoft Word.
    • Make sure that your HTML5 source code has no errors. To check it, use the W3C online validator.
    • For submission, compress all necessary files (HTML, CSS, images) into a ZIP or TAR.GZ archive.

Review

  • Each seminar participant has to review exactly three reports. The reviews are not anonymous.
  • Use the review forms provided in the VSR Seminar Workflow, one per report.
  • Following the review phase, each seminar participant will receive the three peer reviews of his or her report and, if necessary, additional comments by the advisors. You will then have one more week to improve your report according to the received feedback.
  • The seminar grade will consider the final report.
  • All comments in the reviews aim to improve the text and are therefore in the author's interest.

Press Articles