Automated translation of Android context-dependent gestures to visual GUI test instructions.
Riccardo Coppola, Luca Ardito, Marco Torchiano
Published in: A-TEST@ESEC/SIGSOFT FSE (2021)
Keyphrases
- context dependent
- low level
- semantic level
- high level
- visual information
- context free
- hidden markov models
- test cases
- user centric
- visual features
- machine translation
- user friendly
- gesture recognition
- natural language
- mobile devices
- graphical user interfaces
- visual representation
- cross language information retrieval
- mobile applications
- context sensitive