LMM: latency-aware micro-service mashup in mobile edge computing environment.
Ao Zhou, Shangguang Wang, Shaohua Wan, Lianyong Qi. Published in: Neural Comput. Appl. (2020)
Keyphrases
- computing environments
- mashup
- third party
- web applications
- web services
- service discovery
- semantic tagging
- mobile computing
- web content
- model driven
- social networking
- linked data
- pervasive computing
- rapid development
- user interface
- lightweight
- web browser
- end users
- web resources
- grid computing
- context awareness
- privacy protection
- location based services
- web APIs
- data sources
- databases
- pervasive computing environments
- service oriented
- user behavior
- mobile agents
- semantic web
- data driven
- information extraction
- e-learning
- artificial intelligence