Home Page

Although not the primary focus of the lab, the Compositional Systems Lab at the University of Arizona occasionally develops mobile applications that enable research in model-based design and development and that provide societal benefit. As these projects mature, they are made available from this website.

Mobile Device Software: Model-Based Architectures and Examples (ECBS 2013 Tutorial)

This tutorial is a step-by-step approach to understanding the software used by mobile devices, within the context of model-based design. Mobile apps are widely viewed as a success, yet even experienced developers who approach these devices with their existing knowledge may be daunted by new terminology and unclear starting points. This stems from the informal nature of the documentation, which can be helpful to novice coders but frustrating to experienced ones, who want to know how to map their current knowledge onto these new platforms and APIs.

This tutorial focuses on performing that mapping, and aims to make these devices and their APIs accessible in terms of the high-level models that govern their behaviors and many of their designs. Participants who want to learn the high-level software concepts behind mobile device programming, and how those concepts map to canonical UML models, will benefit from the presentation of those concepts during the tutorial.

The following topics will be covered and will feature prominently during the tutorial:

  1. Common mobile software patterns. Model-View-Controller, Delegate, and other design patterns (a brief sketch of the Delegate pattern follows this list).
  2. UML-based mobile software models. The majority of the documentation in both Android and iOS uses only informal models. The tutorial will repackage many high-level architectures as UML models, which makes them more accessible.
  3. Mapping common behaviors to architectures and patterns. Nearly everyone has seen how mobile apps can use gestures and respond to screen rotation. This section will focus on which portions of the design implement these desired behaviors.
  4. Utilizing sensors. GPS, accelerometers, gyroscopes, cameras: all of these sensors are accessed through various design patterns, and this tutorial will discuss those patterns within the context of UML diagrams.
  5. Data models. Using the data models prescribed by the iOS and Android APIs makes it possible to rapidly use various visualization and editing classes. This portion of the tutorial will discuss how such designs can be used to quickly take advantage of these built-in features.
  6. Integration with software synthesis tools. The final portion of the tutorial addresses how domain-specific modeling and other tools can be used to synthesize some (or all) of the code required to go from models to mobile devices.
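
As a concrete taste of topic 1, here is a minimal sketch of the Delegate pattern as mobile APIs commonly use it: a framework-owned view asks a delegate object for its data. The sketch is in Python for brevity; the class and method names are illustrative, not actual iOS or Android identifiers.

```python
from typing import Protocol

class TableDataDelegate(Protocol):
    """The protocol the view expects its delegate to satisfy."""
    def number_of_rows(self) -> int: ...
    def text_for_row(self, row: int) -> str: ...

class TableView:
    """Framework-owned view: it knows *when* to draw; its delegate knows *what* to draw."""
    def __init__(self, delegate: TableDataDelegate):
        self.delegate = delegate

    def render(self) -> None:
        for row in range(self.delegate.number_of_rows()):
            print(f"[{row}] {self.delegate.text_for_row(row)}")

class ContactList:
    """Application code implements the delegate protocol."""
    def __init__(self, names):
        self.names = names
    def number_of_rows(self) -> int:
        return len(self.names)
    def text_for_row(self, row: int) -> str:
        return self.names[row]

TableView(ContactList(["Ada", "Grace"])).render()
```

The inversion is the point: the application never calls the rendering loop directly, it only answers the view's questions, which is how UITableView-style APIs keep framework and application code decoupled.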

HistoryView: Seeing the past through today's lens.

HistoryView is an app that lets users view historical photographs while standing in the place where those photographs were taken. Its visualization technique lets the user line up the photo with the live view from the camera. This work was supported by a grant from the University of Arizona's Confluence Center.
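
The following is a minimal sketch of the visualization idea, not the app's actual implementation: alpha-blend the historical photograph over the current camera frame so the user can line the two up. Pillow stands in here for the mobile camera pipeline, and the file names are placeholders.

```python
from PIL import Image

def overlay(frame_path: str, photo_path: str, alpha: float = 0.5) -> Image.Image:
    """Blend a historical photo over a camera frame; alpha sets photo opacity."""
    frame = Image.open(frame_path).convert("RGBA")
    photo = Image.open(photo_path).convert("RGBA")
    photo = photo.resize(frame.size)          # match extents before blending
    return Image.blend(frame, photo, alpha)

# Placeholder file names for illustration only:
# overlay("camera_frame.png", "historical_photo.png", alpha=0.4).show()
```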

The Precariat: An Intellectual History and Digitally Enhanced Learning

This grant studies the history of an emergent class, the precariat, and develops an interactive mobile application to share the findings with others. The work mixes innovative technologies with new knowledge in a way that is informative and engaging. The precariat are people whose lives and finances are precariously affected by economic shifts, social insecurity, and globalization.

Why mobile? This work is intended to let people explore ideas around the topic on their own devices, rather than through a structured webpage. What enables the work to succeed at scale is the advent of model-based techniques that allow Prof. Ren to specify the relationships between media and the topics they convey, and allow Prof. Sprinkle and his lab to transform those relationships and media into a fully formed application.

Localization


Fig. 2. Data fusion principle in this design

App Summary

Location information can be useful for vehicle operators and autonomous vehicles; however, most localization systems are large and expensive, costing several thousand dollars or more (NovAtel's SPAN-CPT system, for example, is over $20,000). Ubiquitous smart mobile devices such as phones incorporate commodity sensors (GPS, MEMS accelerometer, and gyroscope) that offer an alternative approach to building a localization system, although the accuracy of these sensors cannot be expected to compete with that of the high-priced devices. An accurate localization device must fuse multiple sensors, so a closed-loop indirect linearized Kalman filter for GPS-IMU integration was designed to achieve this objective. The Kalman filter estimates and then compensates for the localization error in its output (as Fig. 2 shows). This design was recently implemented on the iPhone 4 platform; in other words, we built a GPS-IMU fusion algorithm for vehicle localization. The design follows the same working principle as the expensive devices while keeping hardware costs very low.
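
The following is a minimal sketch of the closed-loop indirect (error-state) Kalman filter structure described above, reduced to one dimension for brevity. The sample rate, noise parameters, and state layout are illustrative assumptions, not the values used in the App.

```python
import numpy as np

class ErrorStateKF:
    """Closed-loop indirect KF: the IMU dead-reckons a nominal state, the
    filter estimates the *error* in that state, and each GPS update feeds
    the estimated error back into the output (compensation, as in Fig. 2)."""

    def __init__(self, dt: float = 0.01):
        self.dt = dt                                 # IMU period, assumed 100 Hz
        self.F = np.array([[1.0, dt], [0.0, 1.0]])   # error-state dynamics
        self.Q = np.diag([1e-4, 1e-3])               # process noise, assumed
        self.H = np.array([[1.0, 0.0]])              # GPS observes position only
        self.R = np.array([[25.0]])                  # GPS variance (~5 m sigma), assumed
        self.x = np.zeros(2)                         # nominal [position, velocity]
        self.dx = np.zeros(2)                        # estimated error state
        self.P = np.eye(2)                           # error covariance

    def imu_step(self, accel: float) -> None:
        """Dead-reckon the nominal state and propagate the error filter."""
        self.x = self.x + np.array([self.x[1] * self.dt, accel * self.dt])
        self.dx = self.F @ self.dx
        self.P = self.F @ self.P @ self.F.T + self.Q

    def gps_update(self, gps_pos: float) -> None:
        """Fuse a GPS fix; the measurement is the GPS/dead-reckoning mismatch."""
        z = np.array([gps_pos - self.x[0]]) - self.H @ self.dx
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.dx = self.dx + K @ z
        self.P = (np.eye(2) - K @ self.H) @ self.P
        self.x = self.x + self.dx    # closed loop: compensate the output...
        self.dx[:] = 0.0             # ...and reset the error state
```

The indirect formulation filters the small, slowly varying localization error rather than the full state, which is why the output can be compensated between the frequent IMU steps and the less frequent GPS fixes.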


Table 1. Performance Summary

Localization Performance

A road test was conducted to assess the App's performance. The platform, an iPhone 4, was securely mounted in the front of a vehicle, which was then driven along a rectangular test path of approximately 7.2 km. Table 1 compares the App's results with the iPhone's direct GPS output. These results were achieved without measuring the location of the iPhone in the body-fixed vehicle reference frame. In sum, the App improves azimuth estimation, lateral positioning, and the frequency of localization updates without losing the original GPS-level accuracy in the longitudinal direction.

iPhen: The Vegetation Index and Phenology Mobile Collection Application

iPhen is an inexpensive, portable, mobile-device-based solution to the challenge of collecting field data about the vegetation growing season as it adjusts and responds to climate change. Vegetation dynamics and phenology are key environmental parameters related to food security, the carbon budget, and climate change. Change in these parameters is thought to be a precursor to larger and more ominous change to the Earth system as a whole (e.g., the food chain); thus, understanding and quantifying the rate and direction of these changes is key to addressing the larger questions surrounding global climate change.

Over the last three decades, spaceborne instruments have provided the bulk of the data supporting this field of research. Spectral vegetation indices (VIs) computed from these data are strongly correlated with green vegetation amount, health, and activity. Collecting VI data over time yields time series that are useful for change detection, for tracking the impact of climate change, and for improving ecosystem models. However, these satellite-based data records are very noisy and contain large uncertainties, and validating them is usually expensive, with limited spatial and temporal coverage. With iPhen we address some of these challenges and take advantage of the widespread use of mobile devices. The application is aimed at citizen scientists who are motivated by the current global climate change debate.
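
The page does not say which vegetation index iPhen records; as an illustration, here is the canonical one, the normalized difference vegetation index, NDVI = (NIR − Red) / (NIR + Red). It approaches 1 over dense green vegetation, which reflects strongly in the near-infrared and absorbs red light.

```python
import numpy as np

def ndvi(nir, red):
    """Compute NDVI from near-infrared and red reflectance arrays."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids divide-by-zero

# Example: a healthy canopy reflects strongly in NIR and absorbs red.
print(ndvi([0.5], [0.08]))  # ~0.72, consistent with green vegetation
```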

By using this crowdsourcing technique, it is possible to sample from many more data points (for a much finer-grained snapshot of vegetation growth) than with traditional remote sensing techniques, at a fraction of the equipment cost. For more information on the scientific research performed by the VIP lab, visit their website at http://measures.arizona.edu/.

More information:

  • Sean Whitsitt, Armando Barreto, Sundaresh Ram, Hussain Al-Helal, Maribel Hudson, Diyang Chu, Jonathan Sprinkle, and Kamel Didan. "Crowdsourcing in Support of Vegetation Dynamic and Phenology Research." Presentation at the Fourth Annual Phenology Research and Observations of Southwest Ecosystems (PROSE) Symposium, 2010.