As a former QA team lead within Motorola’s GSM and UMTS infrastructure teams, I’ve seen first-hand how mobile technology has evolved, and how that evolution has progressed differently than other consumer-based technology markets. Two significant transformations took place in the mobile markets that catapulted this slow-moving, high-priced technology to the cutting edge of today’s market. The first improvement was increasing the capacity of base stations. This enabled mobile service providers to break into new markets by providing a means of low-cost, rapid deployment to areas without existing affordable or reliable land-line infrastructure (think India and China). The other advancement, and the one I will focus on here, was the ability to transport data through mobile networks.
Initially, the ability to transport data was limited to slow connections with very little throughput. Once consumers realized that they could turn their mobile phones into multimedia devices, the die was cast. Early adopters built simple, interactive games and messaging products that were focused on generating incremental revenue for the service providers.
As the technology matured over subsequent “generations,” the data connections offered more and more bandwidth. This led to an explosion of applications written specifically for mobile devices that took advantage of these faster data connections. Applications that used to be limited to simple games could now interact with other users, no matter how those users were connected to the web. New browsers, GPS-integrated navigation applications, and countless Web 2.0 applications now have versions specifically designed and built for mobile devices.
“Designed and built for mobile devices” is a nebulous statement now. Traditionally, that has meant that the application was designed to run within the specific firmware and/or operating system, use the device’s specific user interface, and stay within the physical resource limits of the phone. In some cases, the application may have been aware enough to attempt to fool the mobile protocols into keeping the data connections open during times when no data is flowing, but that was about as far as the application’s awareness of how the network functions went.
Applications can be tested on individual devices, with live or test networks, and can even be simulated to provide effective functional testing efficiently. The real problems start when you try to understand the capacity and stress impacts of, say, a few million users trying to access a social website through their phones. The logistics, scale, and cost of trying to test a scenario like that with a real mobile network are so prohibitive that it’s simply not done.
There is another way. With WAN emulation, you can create a test environment that, to the application under test, behaves exactly like the live network. There are software-based WAN emulation products that integrate directly into development environments for functional testing, and appliance-based products that can emulate entire cities, countries, or the worldwide topology of a mobile network. When compared to the complexity and cost of working with a live network, the setup and use of a WAN emulation solution is trivial.
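As a small taste of the idea, Linux ships a basic network emulator, netem, that can impose mobile-like link conditions on a test rig. This is only a sketch of the concept, not a substitute for a full WAN emulation product; the 200 ms delay, 40 ms jitter, and 1% loss figures below are placeholder values, and `eth0` is assumed to be the interface facing the application under test.

```shell
# Impose an emulated mobile link on eth0 (requires root):
# 200 ms base delay, +/- 40 ms jitter, 1% packet loss -- placeholder values.
sudo tc qdisc add dev eth0 root netem delay 200ms 40ms loss 1%

# Confirm the emulated conditions are in place.
tc qdisc show dev eth0

# Remove the emulation when testing is finished.
sudo tc qdisc del dev eth0 root
```

Dedicated emulation products go much further, modeling per-subscriber behavior and full network topologies, but even this one-liner will quickly expose applications that assume a fast, clean connection.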
Only by testing your mobile applications at scale, in a realistic environment, can you understand how your application will perform once deployed. Understanding actual or potential problems can help you decide whether the application needs to be changed, where the best locations are for your media servers, and how many subscribers you can support at any given time. This data is critical to the competitiveness and success of any mobile application business.
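To make the kind of data I mean concrete, here is a minimal sketch of a load run against an emulated link: fire concurrent requests, record each response time, and summarize the distribution. The `handle_request` function and its 150 ms base delay with 50 ms jitter are hypothetical stand-ins for an application reached through a WAN emulator; a real test would replace them with actual client calls and the emulator’s measured profile.

```python
import random
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request() -> float:
    """Serve one request behind an emulated mobile link.

    The 150 ms base delay plus up to 50 ms of jitter are placeholder
    values standing in for a WAN emulator's network profile.
    """
    emulated_rtt = 0.150 + random.uniform(0.0, 0.050)
    time.sleep(emulated_rtt)  # emulated network round trip
    return emulated_rtt

def run_load(clients: int = 50) -> dict:
    """Fire `clients` concurrent requests and summarize response times."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=clients) as pool:
        latencies = list(pool.map(lambda _: handle_request(), range(clients)))
    latencies.sort()
    return {
        "requests": len(latencies),
        "median_s": statistics.median(latencies),
        "p95_s": latencies[int(0.95 * len(latencies))],
        "wall_s": time.perf_counter() - start,
    }

if __name__ == "__main__":
    print(run_load())
```

The median and 95th-percentile figures this produces are exactly the numbers that tell you whether response times stay acceptable as subscriber counts grow, and where capacity runs out.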
How are you currently testing your mobile applications? Do you consider latency, number of connections, and application response times? I’d love to hear about your projects or thoughts on this post.