CS226 Resit Assignment Sample

Introduction: Web Design and User Experience Assignment Sample

QuizQuest is a new quiz system whose prototype shows strong potential as a tool for improving assessment for secondary school teachers. The high-fidelity prototype was built in Google Slides, and the application focuses on how teachers create, organize, and mark quizzes. The project aims to demonstrate a user-centered design process and key features such as support for several question types, quiz scheduling, and analysis of student results. This report describes and evaluates the prototype's appearance, functionality, and likely usability against the goals of efficient quiz creation and performance analysis, showing how it meets users' needs and the demands of educational assessment.

User-Centered Design Analysis

Hierarchical Task Analysis (HTA)

Figure 1: HTA Diagram

(Source: Self-Prepared)

The HTA diagram shows the hierarchy of tasks performed in the prototype.

Use Case Diagram

Figure 2: Use Case Diagram

(Source: Self-Prepared)

The use case diagram shows how the teacher and the student interact with the prototype.

Sequence Diagram

Figure 3: Sequence Diagram

(Source: Self-Prepared)

The sequence diagram shows the order of the tasks the teacher performs in the prototype.

Prototype Implementation

The QuizQuest prototype was implemented in Google Slides and models the quiz system from the teacher's perspective. The features of the developed prototype reflect the specifications laid down and encompass a collection of tools for quiz creation, management, and analysis. On signing in, teachers land on a home page that lists all of their quizzes and related activities (Li et al. 2021). The dashboard links to the basic operations, including creating new quizzes, managing existing quizzes, previewing quizzes that have already been created, and viewing the results of quizzes that students have taken.

Teachers can create quizzes containing different varieties of questions through a well-organized process. The prototype supports three primary question formats: single-choice questions, short-answer questions, and hot-spot questions (Zheng et al. 2022). For single-choice questions, the teacher keys in the question, the answer options, and the correct option. Short-answer questions, as the name suggests, permit students to write answers in free text, while teachers can specify the correct answers to be marked automatically. Hot-spot questions add a visual element: teachers upload an image and define regions that count as correct answers when clicked (Hansen and Özkil, 2020). This flexibility in question types allows a broader approach to assessment, accommodating different learning styles and the specific topic being tested.
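
Since the prototype is a non-interactive Google Slides mock-up, no real data model exists behind these screens; the following TypeScript sketch is one way the three question formats could be represented, with all type and field names being illustrative assumptions rather than part of the prototype.

```typescript
// Hypothetical data model for the three QuizQuest question formats.
// The prototype itself is a Google Slides mock-up; these names are
// illustrative assumptions, not an actual implementation.

interface SingleChoiceQuestion {
  kind: "single-choice";
  prompt: string;
  options: string[];
  correctIndex: number; // index into options
}

interface ShortAnswerQuestion {
  kind: "short-answer";
  prompt: string;
  acceptedAnswers: string[]; // teacher-defined answers for auto-marking
}

interface HotSpotRegion {
  x: number; // top-left corner, in image pixels
  y: number;
  width: number;
  height: number;
}

interface HotSpotQuestion {
  kind: "hot-spot";
  prompt: string;
  imageUrl: string;
  correctRegions: HotSpotRegion[];
}

type Question = SingleChoiceQuestion | ShortAnswerQuestion | HotSpotQuestion;

// A hot-spot answer is correct if the click falls inside any marked region.
function isHotSpotHit(q: HotSpotQuestion, clickX: number, clickY: number): boolean {
  return q.correctRegions.some(
    (r) =>
      clickX >= r.x && clickX <= r.x + r.width &&
      clickY >= r.y && clickY <= r.y + r.height
  );
}
```

A discriminated union like this would let the marking logic branch on the `kind` field, which mirrors how the interface presents the three formats as distinct creation flows.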

Quiz management features are well developed: a teacher can edit, delete, or preview quizzes before using them. The edit feature supports changes to single questions or to the structure of a quiz, keeping content up to date. Scheduling is another important feature: teachers set a particular period during which a quiz is available to students, which controls when students can take it. This time-bound approach covers various forms of assessment, ranging from timed examinations to assignments that students complete over several hours. The results viewing and analysis tools give insight into students' performance and into how well individual questions worked. The platform allows teachers to look at specific students' answers, which is helpful when analyzing performance on particular questions (Jiang et al. 2022). The system also provides aggregate statistics on performance, highlighting the most frequently missed questions and general trends across the class.
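
A minimal sketch of how the availability window and the "most frequently missed questions" statistic might be computed behind such an interface is given below; again, every name here is a hypothetical assumption, since the prototype contains no working logic.

```typescript
// Hypothetical scheduling and results helpers; illustrative only.

interface QuizSchedule {
  opensAt: Date;
  closesAt: Date;
}

// A quiz is available to students only inside its scheduled window.
function isQuizAvailable(s: QuizSchedule, now: Date = new Date()): boolean {
  return now >= s.opensAt && now <= s.closesAt;
}

interface AnswerRecord {
  questionId: string;
  correct: boolean;
}

// Rank questions by miss rate, so the teacher can spot the ones
// the class found hardest.
function mostMissedQuestions(records: AnswerRecord[]): [string, number][] {
  const misses = new Map<string, { wrong: number; total: number }>();
  for (const r of records) {
    const entry = misses.get(r.questionId) ?? { wrong: 0, total: 0 };
    entry.total += 1;
    if (!r.correct) entry.wrong += 1;
    misses.set(r.questionId, entry);
  }
  return [...misses.entries()]
    .map(([id, stats]): [string, number] => [id, stats.wrong / stats.total])
    .sort((a, b) => b[1] - a[1]);
}
```

A results page could, for example, run `mostMissedQuestions` over all answer records for a quiz and surface the top few entries as the class-wide trend view the report describes.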

The design is clean and uncluttered, with a consistent style used throughout the application. Action buttons are clearly labeled, and the flow between the different menus is sensible, making it easy for users to carry out different actions. Quizzes can be created and edited easily: input fields are explained, and the sequence of input is linear. Beveled edges and gradients are absent from the design, and this simplicity suits the academic environment in which the application will be used. At the same time, the prototype, which addresses the fundamental tasks a teacher needs to accomplish in the classroom, suggests areas for improvement. For instance, the current design has no question database, question pool, or option for shuffling questions and answers, although these options seem essential for future versions (Arce et al. 2022). Furthermore, although the answer keys enable bare-bones feedback for wrong answers, the formative feedback system could be extended with further options in the future.

Overall, the QuizQuest prototype satisfies the key requirements of the quiz system as conceptualized from the teacher's viewpoint. With its simple, comprehensible interface, broad feature set, and focus on assessment creation, it is a promising tool for educational evaluation. The prototype covers the basic requirements for quiz creation and management, along with analysis tools that support data-driven teaching.

Cognitive Walkthrough Documentation

The cognitive walkthrough documentation starts by defining the user of the QuizQuest prototype. The primary target user is a secondary school teacher with average computer skills and previous experience using technological solutions in teaching. This teacher wants to create and deliver quizzes to students and needs the process to be simple and easy to manage. The walkthrough task scenarios include creating a new quiz, adding different types of questions to the quiz, modifying an already published quiz, scheduling a quiz for a particular class, and viewing quiz results. In the action sequence for quiz creation, the user logs in, goes to the teacher's dashboard, selects "Create Quiz", enters the quiz details such as title, description, and date, and then proceeds to create the quiz questions (Krishnakumar et al. 2021). At each step, the evaluator should determine whether the user would know what action to take, would understand the feedback the system gives, and would recognize their progress towards the goal.

When inserting questions into the quiz, the user has three question types to choose from: single choice, short answer, and hot-spot. The evaluator should consider whether the differences between these options are obvious and whether the interface explains clearly how to create each question type. For single-choice questions, attention should be paid to adding several options and indicating the correct one. Short-answer questions depend on how easily the teacher can key in the correct answers for automatic grading. Hot-spot questions are more complex, and the evaluator needs to weigh how obvious it is to upload an image and define the areas that will be clickable. For editing, the user navigates to "Manage Quizzes", locates the given quiz, selects the edit option, and makes the required changes. The evaluator should also analyze how clearly the system communicates which parts are changeable and how changes are saved (Carfagni et al. 2020). The scheduling task involves evaluating the visibility and accessibility of the feature that sets a quiz's availability between two dates.

Viewing results for students' performance involves navigating to the "View Quiz Results" area, selecting a particular quiz, and then viewing either individual performances or aggregate statistics. The evaluator should consider how clearly the interface conveys this information and whether teachers can readily interpret the data to gain relevant insights into student performance and question quality. Any usability problems identified during the walkthrough process should be recorded. These may include ambiguous button labels or function descriptions, inconsistent or substandard navigation structures, and insufficient feedback to users (Homeyer et al. 2021). Above all, the evaluator should focus on how to streamline the interface so that users understand harder actions such as creating hot-spot questions or analyzing detailed quiz results.

The cognitive walkthrough documentation should also cover the system's error prevention and recovery. This means analyzing how effectively the interface guards against typical errors, such as accidentally deleting a quiz, and how errors are communicated when they occur (Petridis et al. 2023). The evaluator should check whether the cause of an error is explained to the user, along with advice on how to rectify the situation. Finally, the documentation should consider the navigational aspect of the system: how efficiently the interface lets the user switch from one task to another, for example from creating quizzes to viewing results (Kang et al. 2023). This overall review will help identify any missing features and areas where the quiz system could be improved to increase the teacher's efficiency in her day-to-day work.

Usability Considerations

Usability is incorporated into the QuizQuest prototype in several ways. Clear, high-contrast interfaces and consistent navigation patterns improve accessibility, enabling teachers with little technical expertise to master the system. Error-prevention measures are built in, such as a confirmation dialog before deleting a quiz and input validation before a quiz-creation form is submitted. The system gives immediate feedback on each action through icons and informative messages wherever an action might prove confusing for the user. Responsive layouts keep the platform compatible with different devices, so teachers can work with quizzes from PCs or tablets (Peng et al. 2022). Furthermore, simple, clear icons accompanied by tooltips describe each function for inexperienced users and improve overall effectiveness in quiz creation and result evaluation.
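
These error-prevention measures could be realized roughly as in the sketch below; the `QuizDraft` fields and the caller-supplied confirmation callback are assumptions made for illustration, since the prototype has no working forms.

```typescript
// Hypothetical form validation for quiz creation; illustrative only.

interface QuizDraft {
  title: string;
  description: string;
  opensAt: Date;
  closesAt: Date;
}

// Validate before submission, returning user-readable messages
// rather than silently rejecting the form.
function validateQuizDraft(d: QuizDraft): string[] {
  const errors: string[] = [];
  if (d.title.trim() === "") errors.push("Quiz title is required.");
  if (d.closesAt <= d.opensAt) errors.push("Closing time must be after opening time.");
  return errors;
}

// Destructive actions go through an explicit confirmation step,
// supplied by the UI layer (e.g. a modal dialog).
function deleteQuiz(quizId: string, confirm: (msg: string) => boolean): boolean {
  if (!confirm(`Delete quiz ${quizId}? This cannot be undone.`)) return false;
  // ...actual deletion would happen here in a real implementation
  return true;
}
```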

Limitations and Future Improvements

Despite the effectiveness of the QuizQuest prototype as presented, its specific limitations should be noted. Several features are not yet implemented: there is no question bank, and no option to randomize or shuffle the questions in a quiz. The feedback given for a wrong answer is basic and could be extended into more elaborate, constructive feedback. Integration with existing Learning Management Systems is not addressed, so fitting the tool into some educational settings could be an issue. Possible enhancements include a mobile application for managing quizzes on the go, features that suggest quizzes based on a student's performance, and support for teachers collaborating on quizzes (Schaadhardt et al. 2021). Improved data visualization of quiz results and the addition of dynamic quizzes would further boost the system's usability and educational impact.
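
As one example of the proposed improvements, the question or answer shuffling feature could use a standard Fisher-Yates shuffle; the sketch below is a generic illustration rather than part of the prototype.

```typescript
// Generic Fisher-Yates shuffle, as a future question/answer
// shuffling feature might use; illustrative only.
function shuffle<T>(items: T[]): T[] {
  const result = [...items];
  for (let i = result.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [result[i], result[j]] = [result[j], result[i]];
  }
  return result;
}

// Example: draw five shuffled questions from a hypothetical question bank.
// const selected = shuffle(questionBank).slice(0, 5);
```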

Conclusion

In conclusion, the QuizQuest prototype effectively meets the key requirements of an online quiz system designed for educators. Its user-centered design, ease of use, and wide variety of features make it a promising tool for educational assessment. The prototype's functionality is well complemented by its usability: it offers a variety of question types, flexible quiz scheduling, and in-depth results analysis. Although several aspects can still be improved, none of the gaps is critical, and there are clear directions for development. The cognitive walkthrough documentation points to the system's potential to make teaching and assessment more efficient, benefiting both quiz management and the students.

Reference List

Journals

  • Arce, E., Suárez-García, A., López-Vázquez, J.A. and Fernández-Ibáñez, M.I., 2022. Design Sprint: Enhancing STEAM and engineering education through agile prototyping and testing ideas. Thinking Skills and Creativity, 44, p.101039.
  • Carfagni, M., Fiorineschi, L., Furferi, R., Governi, L. and Rotini, F., 2020. Usefulness of prototypes in conceptual design: students’ view. International Journal on Interactive Design and Manufacturing (IJIDeM), 14, pp.1305-1319.
  • Hansen, C.A. and Özkil, A.G., 2020. From idea to production: A retrospective and longitudinal case study of prototypes and prototyping strategies. Journal of Mechanical Design, 142(3), p.031115.
  • Homeyer, A., Lotz, J., Schwen, L.O., Weiss, N., Romberg, D., Höfener, H., Zerbe, N. and Hufnagl, P., 2021. Artificial intelligence in pathology: From prototype to product. Journal of pathology informatics, 12(1), p.13.
  • Jiang, E., Olson, K., Toh, E., Molina, A., Donsbach, A., Terry, M. and Cai, C.J., 2022, April. Promptmaker: Prompt-based prototyping with large language models. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (pp. 1-8).
  • Kang, B., Crilly, N., Ning, W. and Kristensson, P.O., 2023. Prototyping to elicit user requirements for product development: Using head-mounted augmented reality when designing interactive devices. Design Studies, 84, p.101147.
  • Krishnakumar, S., Berdanier, C., McComb, C. and Menold, J., 2021. Lost in translation: examining the complex relationship between prototyping and communication. Journal of Mechanical Design, 143(9), p.091402.
  • Li, J., Tigwell, G.W. and Shinohara, K., 2021, May. Accessibility of high-fidelity prototyping tools. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-17).
  • Peng, Y.H., Wu, J., Bigham, J. and Pavel, A., 2022, October. Diffscriber: Describing Visual Design Changes to Support Mixed-Ability Collaborative Presentation Authoring. In Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology (pp. 1-13).
  • Petridis, S., Terry, M. and Cai, C.J., 2023, April. Promptinfuser: Bringing user interface mock-ups to life with large language models. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1-6).
  • Schaadhardt, A., Hiniker, A. and Wobbrock, J.O., 2021, May. Understanding blind screen-reader users’ experiences of digital artboards. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-19).
  • Zheng, C., Wang, D., Wang, A.Y. and Ma, X., 2022, April. Telling stories from computational notebooks: AI-assisted presentation slides creation for presenting data science work. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems.