Distributed and Self-organizing Systems

Seminar Web Engineering (WS 2021/2022)

Welcome to the homepage of the Seminar Web Engineering

This website contains all important information about the seminar, including links to available topics as well as information about the seminar process in general.

The interdisciplinary research area Web Engineering develops approaches for the methodological construction of web-based applications and distributed systems as well as their continuous development (evolution). For instance, Web Engineering deals with the development of interoperable Web Services, the implementation of web portals using service-oriented architectures (SOA), fully accessible user interfaces, or even exotic web-based applications that are voice-controlled via the telephone or presented on TV and radio.

The following steps are necessary to complete the seminar:

  • Preparation of a presentation on the topic assigned to you.
  • An additional written report on your topic.
  • A peer review phase in which each report is reviewed by two or three other participants.

Seminar chairs

  • Verena Traubinger
  • Christoph Göpfert
  • Martin Gaedke


Contact

If you have any questions concerning this seminar or the exam as a participant, please contact us via OPAL.

We also offer a feedback system where you can provide anonymous feedback to the presenter of a particular session on what you liked or where we can improve.

Participants

The seminar is offered for students of the following programmes (for pre-requisites, please refer to your study regulations):

If your programme is not listed here, please contact us prior to seminar registration and indicate your study programme, the version (year) of your study regulations (Prüfungsordnungsversion) and the module number (Modulnummer) to allow us to check whether we can offer the seminar for you and find an appropriate mapping.

Registration

You may only participate after registering for the seminar course in OPAL.

Registration opens on 15.10.2021 at 12:00 and closes on 21.10.2021 at 23:59. As the available slots are usually booked quickly, we recommend registering soon after registration opens.

Topics and Advisors

Questions:

  • What are the common and what are the special design characteristics of web-based testbeds?
  • Which artifacts qualify a testbed for web-based development?
  • What are current limitations in web-based testbeds?

Literature:

  • own research

Questions:

  • Which UI metrics for the automatic evaluation of web user interfaces exist? How can they be grouped/categorized (e.g. aesthetics, complexity, usability, accessibility)? What are target/optimum values for those metrics? To which usability aspects are the individual metrics related (efficiency, effectiveness, satisfaction, learnability, safety, trustfulness, accessibility, universality, usefulness)?
  • How are they computed? Which methods are used to compute them? On what input data do they operate? What are the hardware/software requirements for their computation? (A minimal example follows this list.)
  • How much empirical evidence exists that they are correctly working/predicting human perception of user interfaces? Which numbers of test subjects were used in the related experiments, what is the statistical power? Which use cases in industry projects are reported?
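
As a minimal illustration of how such a metric can be computed, the following Python sketch implements one component of a classic aesthetics measure: horizontal balance in the spirit of Ngo & Byrne, derived from element bounding boxes. The bounding boxes are made-up example data; the full metric also includes a vertical component, and real tools extract the boxes from a rendered page.

    def horizontal_balance(boxes, frame_width):
        """Balance in [0, 1]; 1 means left and right halves carry equal visual weight."""
        center = frame_width / 2
        w_left = w_right = 0.0
        for x, y, w, h in boxes:
            area = w * h
            box_center = x + w / 2
            distance = abs(box_center - center)  # lever arm from the vertical midline
            if box_center < center:
                w_left += area * distance
            else:
                w_right += area * distance
        if max(w_left, w_right) == 0:
            return 1.0  # nothing off-center: perfectly balanced
        return 1 - abs(w_left - w_right) / max(w_left, w_right)

    # Three (x, y, width, height) boxes on an 800px-wide page
    boxes = [(10, 10, 200, 50), (620, 10, 180, 60), (10, 80, 780, 300)]
    print(f"balance = {horizontal_balance(boxes, frame_width=800):.3f}")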

Literature:

  • https://vimeo.com/159666829
  • https://spectrum.library.concordia.ca/2024/
  • Ngo, D. C. L., & Byrne, J. G. (2001). Another look at a model for evaluating interface aesthetics. International Journal of Applied Mathematics and Computer Science, 11(2), 515–535.
  • Dou, Q., Zheng, X. S., Sun, T., & Heng, P. A. (2019). Webthetics: Quantifying webpage aesthetics with deep learning. International Journal of Human Computer Studies, 124, 56–66. https://doi.org/10.1016/j.ijhcs.2018.11.006
  • Bakaev, M., Mamysheva, T., & Gaedke, M. (2016). Current trends in automating usability evaluation of websites: Can you manage what you can’t measure? 2016 11th International Forum on Strategic Technology (IFOST), 510–514. https://doi.org/10.1109/IFOST.2016.7884307
  • Miniukovich, A., & De Angeli, A. (2015). Computation of Interface Aesthetics. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 2015-April, 1163–1172. https://doi.org/10.1145/2702123.2702575
  • Riegler, A., & Holzmann, C. (2018). Measuring Visual User Interface Complexity of Mobile Applications With Metrics. Interacting with Computers, 30(3), 207–223. https://doi.org/10.1093/iwc/iwy008
  • Zen, M., & Vanderdonckt, J. (2014). Towards an evaluation of graphical user interfaces aesthetics based on metrics. 2014 IEEE Eighth International Conference on Research Challenges in Information Science (RCIS), 1–12. https://doi.org/10.1109/RCIS.2014.6861050
  • Michailidou, E., Eraslan, S., Yesilada, Y., & Harper, S. (2021). Automated prediction of visual complexity of web pages: Tools and evaluations. International Journal of Human-Computer Studies, 145, 102523. https://doi.org/10.1016/j.ijhcs.2020.102523
  • https://interfacemetrics.aalto.fi/
  • Oulasvirta, A., De Pascale, S., Koch, J., Langerak, T., Jokinen, J., Todi, K., Laine, M., Kristhombuge, M., Zhu, Y., Miniukovich, A., Palmas, G., & Weinkauf, T. (2018). Aalto Interface Metrics (AIM). The 31st Annual ACM Symposium on User Interface Software and Technology Adjunct Proceedings, 16–19. https://doi.org/10.1145/3266037.3266087
  • https://researcher.watson.ibm.com/researcher/view_group.php?id=2238

Questions:

  • What is it?
  • How is it used in IoT?
  • What are the benefits and use cases?
  • How does it compare to existing Web standards such as HTML and HTTP?

Questions:

  • What are GOMS/KLM Models? How do they work? Why are they used? What is the (data) basis on which they were created?
  • For what kinds of user interfaces can they be or have they been applied? What are their limitations?
  • Apply GOMS modeling to real-world examples (e.g., the VSR website) and demonstrate how it can be used to improve these interfaces. (A minimal KLM sketch follows this list.)
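
As a minimal illustration, the following Python sketch computes a Keystroke-Level Model (KLM) time estimate using the standard operator times from Card, Moran & Newell. The modeled task (filling in a login form) is a made-up example, not an analysis of a real interface, and the operator values vary with user skill.

    OPERATOR_SECONDS = {
        "K": 0.28,  # keystroke (average skilled typist)
        "P": 1.10,  # point at a target with the mouse
        "B": 0.10,  # press or release a mouse button
        "H": 0.40,  # home hands between keyboard and mouse
        "M": 1.35,  # mental preparation
    }

    def klm_estimate(sequence):
        """Sum operator times for a sequence like 'M P B B H K K'."""
        return sum(OPERATOR_SECONDS[op] for op in sequence.split())

    # Click the username field, type 4 characters, click the password field,
    # type 6 characters, then click the submit button.
    login = ("M P B B H " + "K " * 4 +
             "M H P B B H " + "K " * 6 +
             "M H P B B")
    print(f"predicted time: {klm_estimate(login):.2f} s")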

Literature:

  • https://cogulator.io/
  • https://syntagm.co.uk/design/klmcalc.shtml
  • https://www.cogtool.org/
  • Card, S. K., Moran, T. P., & Newell, A. (1983). The psychology of human-computer interaction. Hillsdale, N.J. : L. Erlbaum Associates.
  • Kieras, D. (1997). A guide to GOMS model usability evaluation using NGOMSL (Chapter 31). In M. Helander, T. K. Landauer & P. V. Prabhu (Eds.), Handbook of Human-Computer Interaction. Amsterdam: North-Holland Elsevier Science Publishers.
  • John, B. E., & Kieras, D. E. (1996). The GOMS family of user interface analysis techniques: Comparison and contrast. ACM Transactions on Computer-Human Interaction, 3(4), 320–351.
  • John, B. E. (2010). CogTool: Predictive human performance modeling by demonstration. 19th Annual Conference on Behavior Representation in Modeling and Simulation 2010, BRiMS 2010, 308–309.

Questions:

  • What are the rules/guidelines for authoring clean code? What are the underlying assumptions? Find empirical evidence/studies that have tested these assumptions.
  • Formulate hypotheses and create a quantitative or mixed-methods research design to test Clean Code rules.
  • Conduct the experiment for one Clean Code rule. (A sketch of a possible quantitative analysis follows this list.)
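
The following Python sketch shows how the quantitative part of such an experiment could be analyzed: comparing task-completion times between a group reading code that follows a Clean Code rule (here, hypothetically, intention-revealing names) and a control group. The measurements are fabricated placeholder data; the sketch requires the third-party SciPy library.

    from scipy import stats

    # Seconds per comprehension task (fabricated example data)
    clean_names   = [48, 52, 45, 60, 50, 47, 55, 49]
    cryptic_names = [63, 58, 70, 66, 61, 72, 59, 68]

    t, p = stats.ttest_ind(clean_names, cryptic_names)
    print(f"t = {t:.2f}, p = {p:.4f}")

    # A p-value below the chosen significance level (commonly 0.05) would
    # support the hypothesis that the naming rule affects comprehension time;
    # with real data you would also check the test's assumptions (e.g.
    # normality) and report an effect size.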

Literature:

  • Martin, R. C. (2008). Clean Code: A Handbook of Agile Software Craftsmanship (1st ed.). Prentice Hall.
  • Martin, R. C. (2011). The Clean Coder: A Code of Conduct for Professional Programmers (1st ed.). Prentice Hall.
  • https://www.youtube.com/watch?v=ZsHMHukIlJY
  • Kitchenham, B. A., Dyba, T., & Jorgensen, M. (2004). Evidence-based software engineering. Proceedings. 26th International Conference on Software Engineering, 26, 273–281. https://doi.org/10.1109/ICSE.2004.1317449
  • Creswell, J. W. (2014). Research design : Qualitative, quantitative, and mixed methods approaches (4. ed., in). SAGE. https://katalog.bibliothek.tu-chemnitz.de/Record/0008891954
  • Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., Regnell, B., & Wesslén, A. (2012). Experimentation in Software Engineering. In Experimentation in Software Engineering (Vol. 9783642290). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-29044-2

Questions:

  • What is it?
  • What are key features of GatsbyJS?
  • Compare it with other frontend JS frameworks/libraries (pros/cons).

Questions:

  • How does a Systematic Literature Review work? Prepare a guideline for computer science students explaining the main aspects and include a list of relevant publication search engines/catalogues.
  • What does "systematic" mean in SLR, how is it different from other literature review methods? How does it compare to a Structured Literature Review? How does it compare to a Systematic Mapping Studies? What are risks and limitations of the method?
  • How are research questions represented/quantified? What does coding mean in this context?
  • How are search queries constructed? Explain the technique of query expansion for generating additional queries. (See the sketch after this list.)
  • Which SLR artifacts should be provided to allow for reproducibility and replicability?
  • What tools exist to support SLRs? Demonstrate a suitable tool.
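
As a minimal illustration of query construction and expansion, the following Python sketch combines synonym sets into one boolean query and also expands them into individual queries. The synonym sets are illustrative assumptions, not an authoritative search strategy.

    from itertools import product

    synonyms = {
        "population":   ["web application", "web app", "website"],
        "intervention": ["usability evaluation", "usability testing"],
    }

    # One combined boolean query: (t1 OR t2 OR ...) AND (t3 OR t4)
    clauses = [" OR ".join(f'"{t}"' for t in terms) for terms in synonyms.values()]
    print(" AND ".join(f"({c})" for c in clauses))

    # Expansion into individual queries, e.g. for engines without boolean support
    for combo in product(*synonyms.values()):
        print(" ".join(f'"{t}"' for t in combo))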

Literature:

  • Kitchenham, B. (2004). Procedures for Undertaking Systematic Reviews. https://www.inf.ufsc.br/~aldo.vw/kitchenham.pdf
  • Kitchenham, B., Pearl Brereton, O., Budgen, D., Turner, M., Bailey, J., & Linkman, S. (2009). Systematic literature reviews in software engineering - A systematic literature review. Information and Software Technology, 51(1), 7–15.
  • Brereton, P., Kitchenham, B. A., Budgen, D., Turner, M., & Khalil, M. (2007). Lessons from applying the systematic literature review process within the software engineering domain. Journal of Systems and Software, 80(4), 571–583.
  • Petersen, K., Vakkalanka, S., & Kuzniarz, L. (2015). Guidelines for conducting systematic mapping studies in software engineering: An update. Information and Software Technology, 64, 1–18.
  • Díaz, O., Medina, H., & Anfurrutia, F. I. (2019). Coding-Data Portability in Systematic Literature Reviews. Proceedings of the Evaluation and Assessment on Software Engineering - EASE ’19, 178–187.
  • Khadka, R., Saeidi, A. M., Idu, A., Hage, J., & Jansen, S. (2013). Legacy to SOA Evolution: A Systematic Literature Review. In A. D. Ionita, M. Litoiu, & G. Lewis (Eds.), Migrating Legacy Applications: Challenges in Service Oriented Architecture and Cloud Computing Environments (pp. 40–71). IGI Global.
  • Jamshidi, P., Ahmad, A., & Pahl, C. (2013). Cloud Migration Research: A Systematic Review. IEEE Transactions on Cloud Computing, 1(2), 142–157.
  • Rai, R., Sahoo, G., & Mehfuz, S. (2015). Exploring the factors influencing the cloud computing adoption: a systematic study on cloud migration. SpringerPlus, 4(1), 197.
  • A. Hinderks, F. José, D. Mayo, J. Thomaschewski and M. J. Escalona, "An SLR-Tool: Search Process in Practice : A tool to conduct and manage Systematic Literature Review (SLR)," 2020 IEEE/ACM 42nd International Conference on Software Engineering: Companion Proceedings (ICSE-Companion), 2020, pp. 81-84.

Questions:

  • What does the term "Hearing Impairment" mean?
  • What are user requirements from People with Hearing Impairments for using Web Applications (User Interface, modalities, controls)? Do these requirements differ according to the specific disability?
  • Consider navigation applications as a use case for the following research tasks (a typical class of applications with a voice-based user interface).
  • Write guidelines for accessible Web Applications concerning People with Hearing Impairments. Compare Web Applications against these guidelines (e.g. Google Maps or OpenStreetMap).
  • Build a demonstration application where you implement your guidelines.
  • Compare your application with other Web Applications like Google Maps or OpenStreetMap in the context of your guidelines and evaluate them.

Literature:

  • own research
  • Web Content Accessibility Guidelines (WCAG) 2.1 https://www.w3.org/TR/WCAG21/
  • Diverse Abilities and Barriers | Web Accessibility Initiative (WAI) | W3C https://www.w3.org/WAI/people-use-web/abilities-barriers/
  • Maya Gupta, Ali Abdolrahmani, Emory Edwards, Mayra Cortez, Andrew Tumang, Yasmin Majali, Marc Lazaga, Samhitha Tarra, Prasad Patil, Ravi Kuber, and Stacy M. Branham. 2020. Towards More Universal Wayfinding Technologies: Navigation Preferences Across Disabilities. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI '20). Association for Computing Machinery, New York, NY, USA, 1–13. DOI:https://doi.org/10.1145/3313831.3376581 https://dl.acm.org/doi/pdf/10.1145/3313831.3376581
  • Giakoumis, D., Kaklanis, N., Votis, K. et al. Enabling user interface developers to experience accessibility limitations through visual, hearing, physical and cognitive impairment simulation. Univ Access Inf Soc 13, 227–248 (2014). https://doi.org/10.1007/s10209-013-0309-0 https://link.springer.com/article/10.1007/s10209-013-0309-0
  • German UPA - Accessibility - Universal Design https://germanupa.de/sites/default/files/public/content/2018/2018-03-27/160721fsbarrierefreiheitenbfpdfua.pdf

Questions:

  • What does the term "Cognitive Impairment" mean?
  • What are user requirements from People with Cognitive Impairments for using Web Applications (User Interface, modalities, controls)? Do these requirements differ according to the specific disability?
  • Consider navigation applications as a use case for the following research tasks (a typical class of applications that imposes a high cognitive load on its users).
  • Write guidelines for accessible Web Applications concerning People with Cognitive Impairments. Compare Web Applications against these guidelines (e.g. Google Maps or OpenStreetMap).
  • Build a demonstration application where you implement your guidelines.
  • Compare your application with other Web Applications like Google Maps or OpenStreetMap in the context of your guidelines and evaluate them.

Literature:

  • own research
  • Web Content Accessibility Guidelines (WCAG) 2.1 https://www.w3.org/TR/WCAG21/
  • Diverse Abilities and Barriers | Web Accessibility Initiative (WAI) | W3C https://www.w3.org/WAI/people-use-web/abilities-barriers/
  • Making Content Usable for People with Cognitive and Learning Disabilities - W3C Working Group Note 29 April 2021 https://www.w3.org/TR/coga-usable/
  • Maya Gupta, Ali Abdolrahmani, Emory Edwards, Mayra Cortez, Andrew Tumang, Yasmin Majali, Marc Lazaga, Samhitha Tarra, Prasad Patil, Ravi Kuber, and Stacy M. Branham. 2020. Towards More Universal Wayfinding Technologies: Navigation Preferences Across Disabilities. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI '20). Association for Computing Machinery, New York, NY, USA, 1–13. DOI:https://doi.org/10.1145/3313831.3376581 https://dl.acm.org/doi/pdf/10.1145/3313831.3376581
  • Giakoumis, D., Kaklanis, N., Votis, K. et al. Enabling user interface developers to experience accessibility limitations through visual, hearing, physical and cognitive impairment simulation. Univ Access Inf Soc 13, 227–248 (2014). https://doi.org/10.1007/s10209-013-0309-0 https://link.springer.com/article/10.1007/s10209-013-0309-0
  • German UPA - Accessibility - Universal Design https://germanupa.de/sites/default/files/public/content/2018/2018-03-27/160721fsbarrierefreiheitenbfpdfua.pdf

Questions:

  • What is empirical Software Engineering Evaluation and how can it be done?
  • Why is evaluation important in the research process?
  • What is the difference between a qualitative and quantitative evaluation? (When do you use which one? What are advantages and disadvantages?)
  • Prepare a list of evaluation methods and tools that can be used to evaluate software. Explain them and add relevant literature for these methods.
  • Demonstrate one quantitative and one qualitative method. (A sketch of one quantitative building block follows this list.)
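
As a minimal illustration of one quantitative building block, the following Python sketch computes Cohen's d, the standardized difference between two group means, which is often reported alongside significance tests. The scores are fabricated placeholder data; only the standard library is used.

    from statistics import mean, stdev

    def cohens_d(a, b):
        """Effect size using the pooled standard deviation."""
        na, nb = len(a), len(b)
        pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                     / (na + nb - 2)) ** 0.5
        return (mean(a) - mean(b)) / pooled_sd

    tool_a = [4.1, 3.8, 4.5, 4.0, 4.2]  # e.g. satisfaction scores on a 1-5 scale
    tool_b = [3.2, 3.5, 3.0, 3.6, 3.1]
    print(f"d = {cohens_d(tool_a, tool_b):.2f}")  # |d| near 0.8 is conventionally "large"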

Literature:

  • Own research
  • Creswell, J. W. (2014). Research design : Qualitative, quantitative, and mixed methods approaches (4. ed., in). SAGE. https://katalog.bibliothek.tu-chemnitz.de/Record/0008891954
  • Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., Regnell, B., & Wesslén, A. (2012). Experimentation in Software Engineering. In Experimentation in Software Engineering (Vol. 9783642290). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-29044-2
  • Chatzigeorgiou, A., Chaikalis, T., Paschalidou, G., Vesyropoulos, N., Georgiadis, C. K., & Stiakakis, E. (2015). A Taxonomy of Evaluation Approaches in Software Engineering. Proceedings of the 7th Balkan Conference on Informatics Conference - BCI ’15, 1–8. https://doi.org/10.1145/2801081.2801084
  • Wainer, J., Novoa Barsottini, C. G., Lacerda, D., & Magalhães de Marco, L. R. (2009). Empirical evaluation in Computer Science research published by ACM. Information and Software Technology, 51(6), 1081–1085. https://doi.org/10.1016/j.infsof.2009.01.002

Questions:

  • When can the description of a research dataset be considered "good"?
  • Which research data properties are essential in terms of interdisciplinary reusability?
  • How can we measure or evaluate the quality of research data descriptions? (A minimal sketch follows this list.)
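
One way to operationalize "quality of a description" is a completeness check against fields that matter for interdisciplinary reuse. The following Python sketch does this; the field list is an illustrative assumption loosely inspired by the FAIR principles, not an established standard, and the record is made-up example data.

    REQUIRED_FIELDS = ["title", "description", "creator", "license",
                       "format", "methodology", "keywords", "access_url"]

    def description_completeness(record):
        """Fraction of required fields that are present and non-empty."""
        filled = [f for f in REQUIRED_FIELDS if record.get(f)]
        missing = [f for f in REQUIRED_FIELDS if f not in filled]
        return len(filled) / len(REQUIRED_FIELDS), missing

    record = {"title": "Traffic sensor readings 2021", "creator": "VSR",
              "license": "CC-BY-4.0", "format": "CSV"}
    score, missing = description_completeness(record)
    print(f"completeness: {score:.0%}, missing: {missing}")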

Questions:

  • What are digital twins?
  • What are the current solutions within the Internet of Things domain?
  • How can a digital twin for an IoT device be created? (A minimal sketch follows this list.)
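
The following Python sketch illustrates the core idea only: a virtual object that mirrors the last reported state of a physical device and can be queried independently. Real IoT platforms (e.g. those built on MQTT or the W3C Web of Things) are far richer; the device message here is made up.

    import json, time

    class DigitalTwin:
        def __init__(self, device_id):
            self.device_id = device_id
            self.state = {}        # last known state of the physical device
            self.updated_at = None

        def apply_report(self, message):
            """Merge a state report (JSON string) sent by the device."""
            self.state.update(json.loads(message))
            self.updated_at = time.time()

        def is_stale(self, max_age_seconds=60):
            """True if no report has arrived recently enough to trust the mirror."""
            return (self.updated_at is None or
                    time.time() - self.updated_at > max_age_seconds)

    twin = DigitalTwin("thermostat-42")
    twin.apply_report('{"temperature": 21.5, "target": 22.0, "heating": true}')
    print(twin.state, "stale:", twin.is_stale())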

Questions:

  • What are the different types of approaches for extracting structured information from unstructured text? (A minimal example follows this list.)
  • How do the existing approaches compare in terms of performance?
  • What is the best state-of-the-art solution for achieving this?
  • What are some of the existing European health discharge summary datasets?
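
As a minimal illustration of the simplest family of approaches, the following Python sketch uses rule-based (regular-expression) extraction to pull structured fields out of unstructured clinical text. The pattern and the example sentence are fabricated; real systems combine dictionaries, machine learning, or hybrid pipelines.

    import re

    text = "Discharged on 2021-11-03. Medication: Ibuprofen 400 mg twice daily."

    medication = re.search(r"Medication:\s*(?P<drug>\w+)\s+(?P<dose>\d+\s*mg)", text)
    date = re.search(r"\b(\d{4}-\d{2}-\d{2})\b", text)

    print({"drug": medication["drug"], "dose": medication["dose"],
           "discharge_date": date[1]})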

Questions:

  • What is Trust Evaluation?
  • Where does Trust Evaluation fit into Computing with Trust?
  • What are typical types of Trust Evaluation Metrics and how are they calculated? (A minimal example follows this list.)
  • How can you make Trust Metrics attack-resistant?
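
As a minimal illustration of one typical metric, the following Python sketch computes the beta reputation score (Jøsang & Ismail), which maps r positive and s negative past interactions to an expected trust value (r + 1) / (r + s + 2). The interaction counts are made-up examples.

    def beta_trust(positive, negative):
        """Expected trust value under a Beta(r + 1, s + 1) prior."""
        return (positive + 1) / (positive + negative + 2)

    print(beta_trust(0, 0))    # 0.50 - no evidence, neutral prior
    print(beta_trust(9, 1))    # 0.83 - mostly positive history
    print(beta_trust(90, 10))  # 0.89 - same ratio, more evidence

    # Attack resistance (e.g. against ballot stuffing) is often approached by
    # weighting each rating by the rater's own trust score or by discounting
    # old ratings - not shown here.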

Questions:

  • What does Accessibility in the Web mean?
  • What are general guidelines for accessible Web Interfaces? Name and explain them.
  • What tools are available to evaluate accessibility? Compile a list of them and their use cases.
  • Show with an example how accessibility in the Web can be checked and improved. (A minimal automated check follows this list.)
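
As a minimal illustration of an automated accessibility check, the following Python sketch flags <img> elements without an alt attribute (related to WCAG success criterion 1.1.1) using only the standard library. Real tools such as axe or WAVE cover far more success criteria.

    from html.parser import HTMLParser

    class MissingAltChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.violations = []

        def handle_starttag(self, tag, attrs):
            attributes = dict(attrs)
            if tag == "img" and "alt" not in attributes:
                self.violations.append(attributes.get("src", "<no src>"))

    html = '<p><img src="logo.png" alt="VSR logo"><img src="chart.png"></p>'
    checker = MissingAltChecker()
    checker.feed(html)
    print("images missing alt text:", checker.violations)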

Literature:

  • Own research
  • Web Content Accessibility Guidelines (WCAG) 2.1 https://www.w3.org/TR/WCAG21/
  • Diverse Abilities and Barriers | Web Accessibility Initiative (WAI) | W3C https://www.w3.org/WAI/people-use-web/abilities-barriers/
  • German UPA - Accessibility - Universal Design https://germanupa.de/sites/default/files/public/content/2018/2018-03-27/160721fsbarrierefreiheitenbfpdfua.pdf

Seminar Opening

The date and time of the seminar opening meeting will be announced via OPAL.

Short Presentation

The date and time of the short presentations will be announced via OPAL.

In your short presentation, you will provide a brief overview of your selected topic.

This includes the following aspects:

1. What is your topic about?
2. Which literature sources have you researched so far?
3. What is your idea for a demonstration?

Following your short presentations, the advisors will provide you with feedback and hints for your full presentations.

Hints for your Presentation

  • As a rule of thumb, you should plan two minutes per slide. Significantly more slides per minute exceed the perceptive capacity of your audience.
  • Prior to your presentation, consider the following points: What is the main message of my presentation? What should the listeners take away? Build your presentation around these considerations.
  • The following site provides many good hints: http://www.garrreynolds.com/preso-tips/

Seminar Days

The dates and times of the seminar days will be announced via OPAL.

Report

  • Important hints on citing:
    • Any statement that does not originate from the author has to be provided with a reference to the original source.
    • "When to Cite Sources" - a very good overview by Princeton University
    • Examples of correct citation can be found in the IEEE citation reference
    • Web resources are cited with author, title, and date, including URL and request date. For example:
      • [...] M. Nottingham and R. Sayre. (2005). The Atom Syndication Format - Request for Comments: 4287 [Online]. Available: http://www.ietf.org/rfc/rfc4287.txt (18.02.2008).
      • [...] Microsoft. (2015). Microsoft Azure Homepage [Online]. Available: http://azure.microsoft.com/ (23.09.2015).
      • A URL should be a clickable hyperlink if technically possible.
  • Further important hints for the submission of your written report:
    • Apart from justifiable exceptions (for instance, highlighting text using <strong>...</strong>), use only HTML elements that occur in the template. The provided CSS file may not be changed.
    • Before submitting your work, carefully check spelling and grammar, preferably with software support, for example the spell checker of Microsoft Word.
    • Make sure that your HTML5 source code has no errors. To check it, use the W3C online validator. (A sketch for scripted checks follows this list.)
    • For submission, compress all necessary files (HTML, CSS, images) into a ZIP or TAR.GZ archive.
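
For scripted checks, the following Python sketch submits a document to the W3C Nu HTML validator's JSON interface. It assumes the public endpoint https://validator.w3.org/nu/, the third-party requests library, and a local file named report.html.

    import requests

    with open("report.html", "rb") as f:
        response = requests.post(
            "https://validator.w3.org/nu/?out=json",
            data=f.read(),
            headers={"Content-Type": "text/html; charset=utf-8"},
        )

    # Each message has a type (e.g. "error" or "info") and a human-readable text
    for message in response.json()["messages"]:
        print(f'{message["type"]}: {message.get("message", "")}')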

Review

  • Each seminar participant has to review exactly three reports. The reviews are not anonymous.
  • Use the review forms provided in the VSR Seminar Workflow, one per report.
  • Following the review phase, each seminar participant will receive the three peer reviews of his or her report and, if necessary, additional comments by the advisors. You will then have one more week to improve your report according to the received feedback.
  • The seminar grade will consider the final report. All comments in the reviews are meant to improve the text and are therefore in the interest of the author.

Press Articles