In this post I continue transcribing some notes I took during the Human-Computer Interaction course from the Interaction Design Foundation, taught by Alan Dix.
The course, which I strongly recommend, is at https://www.interaction-design.org/courses/human-computer-interaction.
This post covers the topic of Implementation, specifically Paint Models and Event Models.
Note: I wrote the notes in English because the course is in English!
I hope you find them useful 🙂
Paint Models
Paint models describe how content gets drawn to the screen.
There are four major ways toolkits manage painting to the screen:
- Direct to the screen – the lowest level; drawing happens essentially at the hardware level. It's rare nowadays.
- Direct to the screen via a viewport.
- Using a separate (off-screen) buffer, which is then copied to the screen.
- A display list / model, where the toolkit keeps a model of what is on screen and repaints from it.
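To make the last option concrete, here is a minimal sketch of a display-list paint model. The toolkit keeps a model (a list) of drawing commands and can repaint the screen from it on demand; all class and method names here are illustrative, not from any real toolkit.

```python
# Sketch of a display-list paint model: the toolkit retains a model of
# what is drawn and repaints from it, instead of drawing directly.

class DisplayList:
    def __init__(self, width, height):
        self.width = width
        self.height = height
        self.items = []  # the retained model of what is on screen

    def add(self, x, y, text):
        # In a real toolkit this would also mark a region as "damaged"
        # so only part of the screen needs repainting.
        self.items.append((x, y, text))

    def paint(self):
        # Repaint the whole "screen" from the model. Because the model
        # is kept, an obscured or resized window can always be redrawn.
        screen = [[" "] * self.width for _ in range(self.height)]
        for x, y, text in self.items:
            for i, ch in enumerate(text):
                if 0 <= x + i < self.width and 0 <= y < self.height:
                    screen[y][x + i] = ch
        return ["".join(row) for row in screen]

dl = DisplayList(10, 2)
dl.add(0, 0, "hello")
dl.add(2, 1, "world")
print("\n".join(dl.paint()))
```

The key design point is that the application never touches pixels directly: it edits the model, and the toolkit decides when and what to repaint.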
Event Models
Event models are about understanding what the user is doing and, more generally, anything that happens that the application should know about.
There are different event models, such as the notification-based event model.
Notification style affects the interface. Consider dialog boxes: in the notification-based event model, if you don't do extra work, a dialog box will be non-modal by default. To make it modal you must specifically tell the system not to respond to the user's input when they press buttons outside the dialog box.
If you are not careful, the toolkit, rather than your own deliberate design choices, will determine your design.
Implementation should not drive design!
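A minimal sketch of the notification-based model illustrates the dialog-box point: the application registers callbacks and the toolkit notifies them as events arrive. By default every widget receives its events (non-modal); making a dialog modal is the extra work of deliberately dropping input aimed elsewhere. All names here are illustrative.

```python
# Sketch of a notification-based event model with an explicit modal
# "grab": while a modal widget is set, input to other widgets is dropped.

class Dispatcher:
    def __init__(self):
        self.handlers = {}  # widget name -> callback
        self.modal = None   # widget currently grabbing all input, if any

    def register(self, widget, callback):
        self.handlers[widget] = callback

    def notify(self, widget, event):
        # The extra work for modality: while a modal dialog is open,
        # ignore input directed at any other widget.
        if self.modal is not None and widget != self.modal:
            return False
        self.handlers[widget](event)
        return True

d = Dispatcher()
log = []
d.register("save_button", lambda e: log.append(("save", e)))
d.register("confirm_dialog", lambda e: log.append(("dialog", e)))

d.notify("save_button", "click")   # delivered: no modal dialog yet
d.modal = "confirm_dialog"         # the dialog opens modally
d.notify("save_button", "click")   # dropped: outside the modal dialog
d.notify("confirm_dialog", "ok")   # delivered
print(log)
```

Without the `modal` check, every widget would keep responding, which is exactly the non-modal default the notes describe.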
Asynchrony: events can happen in any order. This makes testing difficult, because testers usually follow a script that assumes a fixed sequence of actions.
Note that "user-directed input" is not itself an event-management model; the "read-evaluation loop" and "notification-based" event models are.
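For contrast with the notification-based model, here is a minimal sketch of the read-evaluation loop: the application itself reads each event in one central loop and decides how to respond. The event names and responses are illustrative.

```python
# Sketch of a read-evaluation loop: one central loop reads events and
# dispatches on them, instead of the toolkit calling back into the app.

def read_evaluation_loop(events):
    responses = []
    for event in events:  # in a real system: block until the next event
        if event == "key_press":
            responses.append("insert character")
        elif event == "mouse_click":
            responses.append("move cursor")
        else:
            responses.append("ignore")
    return responses

print(read_evaluation_loop(["key_press", "mouse_click", "resize"]))
```

The trade-off is control: here the application owns the loop and sees every event, whereas in the notification-based model the toolkit owns the loop and only notifies registered handlers.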