The tremendous growth in the wireless subscriber base, coupled with the increased use
of data applications, is making decision makers at telcos realize the
importance of investing in state-of-the-art T&M equipment. At the same
time, the emergence of 2.5G and 3G wireless networks meant to support high-speed
data traffic poses new challenges. TRAI's recent paper found most
telecom service providers wanting on QoS parameters. It may be recalled that
TRAI had set a timeframe of 36 months for mobile operators and 48 months for fixed-line
operators, which has now elapsed, and the regulator is planning to review the QoS parameters.
Technology Trends
Software-based solutions gain prominence: T&M equipment manufacturers
are placing greater emphasis on software to make measurements easier,
faster, and more accurate. Since software-based test equipment uses
PC-based platforms, the same hardware can be used to test any of the standards,
such as GSM, GPRS or CDMA, simply by changing the software. In other words, the PC is
likely to become the standard T&M instrument hardware in the near
future, with software testing tools integrated into it.
These solutions are easy to configure (according to user requirements) and use,
and they also come at a lower cost. For example, the concept of 'virtual
instrumentation' promoted by National Instruments allows users to design and
configure their own instruments according to their requirements using software
tools. Similarly, Agilent Technologies' new wireless
application measurement software (WAMS), the central component of its data
service assurance program, enables RF engineering teams to automate data
traffic, quantify the wireless data user experience and find the cause of problems
through leading visualization and analysis tools, thereby reducing the cost of
wireless data rollout.
WAMS enables RF engineers to build complex serial and parallel test sequences
via an easy-to-use and customizable interface. Tests are independent of wireless
technology and include multimedia message service (MMS), short message service (SMS),
HTTP, FTP, e-mail, wireless application protocol (WAP), and PING.
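To make the idea of scripted serial and parallel test sequences concrete, here is a minimal Python sketch. It is purely illustrative and does not use the actual WAMS interface; the run_test() helper and the step names are hypothetical stand-ins, and only the PING step does real work.

```python
# Illustrative sketch only -- not the actual WAMS API. It assumes a
# hypothetical run_test() helper and a Unix-like "ping" command, and
# shows how serial and parallel data-service test steps could be scripted.
import concurrent.futures
import subprocess
import time

def run_test(name):
    """Hypothetical single test step; here PING is the only real action."""
    if name == "PING":
        result = subprocess.run(["ping", "-c", "3", "example.com"],
                                capture_output=True)
        return name, result.returncode == 0
    # Placeholders for MMS, SMS, HTTP, FTP, e-mail and WAP steps.
    time.sleep(0.1)
    return name, True

# Serial sequence: steps run one after another.
serial_steps = ["SMS", "MMS", "WAP"]
serial_results = [run_test(step) for step in serial_steps]

# Parallel sequence: independent data services exercised concurrently.
parallel_steps = ["HTTP", "FTP", "PING"]
with concurrent.futures.ThreadPoolExecutor() as pool:
    parallel_results = list(pool.map(run_test, parallel_steps))

for name, passed in serial_results + parallel_results:
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```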
Portable T&M instruments: T&M companies are building handheld
T&M devices with more functions and power. For example, service providers
can expect a growing number of portable analyzers that meet the needs of
burgeoning field applications. What is notable about these instruments is that even
though they are portable, they are powerful and feature-packed. Many
T&M equipment manufacturers are also trying to harness the power and portability
of handheld computers. Noah Industries (Melbourne, FL), for instance, offers a
fiber-optic power meter that ties into PDAs and provides graphs of the fiber's
performance.
Realtime RF analyzers: Tektronix has come up with a new series of
real-time spectrum analyzers that provides the ability to trigger, capture and
analyze time-varying RF signals. RF signal characteristics are becoming more
complex as RF communications increasingly replace wired technologies in
applications ranging from inventory identification to video games. Today's RF
signals carry complex modulation and change from one instant to the next,
hopping frequencies, spiking briefly, and then disappearing. As a result, these
signals are difficult to measure and behave unpredictably, making it extremely
challenging for engineers to observe RF devices with existing spectrum analyzers.
This evolution in RF technology has spawned unprecedented demand for a new
approach to spectrum analysis, and real-time spectrum analysis has emerged as
the vehicle to address it.
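As a rough illustration of the principle behind real-time spectrum analysis, the following Python sketch computes successive short-time spectra of an invented bursty, frequency-hopping signal, the kind of time-varying behaviour that a conventional swept analyzer can easily miss. All signal parameters are assumed for the example.

```python
# A minimal numpy sketch of the idea behind real-time spectrum analysis:
# compute back-to-back short-time spectra so that hopping and bursting
# behaviour becomes visible over time. Parameters are invented.
import numpy as np

fs = 1e6                          # assumed sample rate, 1 MS/s
t = np.arange(0, 0.01, 1 / fs)

# Frequency-hopping burst: 100 kHz for the first 4 ms, then 250 kHz,
# with a 1 ms gap in between (the signal briefly "disappears").
sig = np.where(t < 0.004, np.sin(2 * np.pi * 100e3 * t),
               np.sin(2 * np.pi * 250e3 * t))
sig[(t >= 0.004) & (t < 0.005)] = 0.0

# Short-time FFTs over 256-sample frames: each row is one spectrum slice.
frame = 256
window = np.hanning(frame)
n_frames = len(sig) // frame
spectrogram = np.array([
    np.abs(np.fft.rfft(window * sig[i * frame:(i + 1) * frame]))
    for i in range(n_frames)
])

# Report the dominant frequency in every tenth time slice.
freqs = np.fft.rfftfreq(frame, 1 / fs)
for i, row in enumerate(spectrogram[::10]):
    ms = i * 10 * frame / fs * 1e3
    print(f"t={ms:.2f} ms  peak={freqs[row.argmax()] / 1e3:.0f} kHz")
```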
- Digital technology is becoming all-pervasive. Whether it is communication
systems, control systems, radar systems or any other, almost all analog blocks
are being replaced by digital ones. This is bringing profound changes in the
test tools that are required: signal analyzers must have digital demodulation
capability (a minimal demodulation sketch follows this list), while signal
sources must be able to create signals with digital modulation formats.
Mixed-signal oscilloscopes and logic analyzers have become the tools of choice
for debugging digital designs.
- Whether digital or analog, designs are moving to increasingly higher
frequencies. In the digital domain, in order to accommodate higher data rates,
signal buses are changing from the traditional "parallel" architectures to
"serial" ones. This brings new and unforeseen challenges for design engineers
and manufacturers. Signal integrity has become a critical factor in the success
of digital designs, and traditional microwave tools like vector network
analyzers are now being used to assess the robustness of the physical-layer
design.
- The induction of new technologies such as VoIP, MPLS, WLAN and UWB demands
suitable new design and validation tools.
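As a small illustration of the digital demodulation capability mentioned in the first point above, the following Python sketch demodulates noisy QPSK samples by slicing their I/Q components. It is a textbook-style example under assumed ideal conditions, not the processing chain of any particular analyzer.

```python
# A minimal sketch, assuming ideal timing and no carrier offset, of the
# kind of digital demodulation a modern signal analyzer performs:
# mapping received QPSK samples back to bits from their I/Q signs.
import numpy as np

rng = np.random.default_rng(0)

# Transmit side: random bits mapped to QPSK symbols (bit 0 -> +1, bit 1 -> -1).
bits = rng.integers(0, 2, size=200)
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

# Channel: add a little complex Gaussian noise.
noise = (rng.normal(scale=0.05, size=symbols.size)
         + 1j * rng.normal(scale=0.05, size=symbols.size))
received = symbols + noise

# Demodulation: decide each bit from the sign of the I and Q components.
rx_bits = np.empty_like(bits)
rx_bits[0::2] = (received.real < 0).astype(int)
rx_bits[1::2] = (received.imag < 0).astype(int)

print("bit errors:", int(np.sum(rx_bits != bits)))
```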
Tips
Compliance with ITU and international recommendations: For most T&M
solutions, the specifications are laid out by international bodies like ITU-T,
ETSI and ANSI for accuracy and consistency. The buyer must ensure that the
instrument complies with these standards. Since the specifications for the
instrument are much tighter than those of the equipment or device to be tested,
compliance with the international bodies' recommendations is essential; without
it, measurements may lose accuracy and repeatability.
Portability and ruggedness: The equipment should be easy to carry and should
have easily accessible connections. It should be rugged enough to withstand a
wide range of temperature, shock and vibration, and humidity conditions.
Technology support: The instrument should support all the technologies relevant
to the testing at hand. For example, if the instrument is designed for testing
optical-fiber transmission, it should support the currently prevailing
technologies such as SDH, POS and ATM over SDH together in one box. This allows
the buyer to test easily in a mixed-technology environment and also supports
testing during technology migration.
Application fitness: The instrument should not be so general purpose that it
becomes an overkill on the buyer's investment; it should target the correct
market segment. For example, certain tests are a must in field installation and
maintenance, and these are the key concern areas of the network operator. The
instrument must be able to perform all the tests needed by this segment, but
need not include the entire range of tests required in manufacturing and
R&D. This results in value for money for the buyer. Operators do not use
R&D- and manufacturing-type tests, as these fall under factory acceptance
testing (FAT).
Scalability: The instrument should be able to scale to higher bandwidths,
interfaces, and testing requirements. It should also be possible to upgrade the
instrument without hardware or software limitations, so that the investment
remains protected as the technology evolves. For example, GPRS and 3G
technologies are still in the growth stage and have not yet matured; these
technologies are still being modified, and the instrument should be upgradable
to the changing technologies and protocols.
Pre- and after-sales support: The company should have a direct presence in
India so that it can advise on and optimize testing and test times. The T&M
supplier should have a strong technical support team so that solutions can be
found instantly while troubleshooting.
Repair and uptime support: The T&M supplier should have a standard
calibration, repair and support center so that round-the-clock (24/7) support
can be obtained.
Solution level from company: The T&M company should have core competence in
test and measurement; T&M should not be a non-focus segment within the
company. The company should also be able to provide test and measurement
solutions across all segments, right from R&D to operations. Such companies
are able to offer T&M solutions that are the best fit for the market.
Deployment trends
Next-generation SDH networks will be designed on an MPLS technology platform,
and VoIP will become popular for voice communication. With the increasing
demand for bandwidth from corporate and SOHO users, both access and long-haul
networks will be required to provide broadband capability. Service providers
will increasingly depend on OSS/BSS systems in order to maintain higher QoS and
grow their revenue and profit.
The
complexity of test systems is driven and defined by the complexity of the
product that is being tested. There is a rapid convergence of varied
technologies into single products and thus there is an increasing onus being
placed on the systems that test them. A perfect example is the cellular phone.
In phones from a few years ago, voice was the primary application, whereas now
it is just one among an array of features such as camera, video, Bluetooth,
WLAN, gaming, and MP3 and video players, to name a few.
We see that cellular phones have evolved from being just RF circuitry to
being computing platforms. This means that the phone is a software-enabled
device, i.e. a device whose functionality is dictated primarily by the software
on it. And if we look around, we see more and more such devices. New features
are added at an amazing rate, and as a result the testing of such devices is
increasingly challenging. Traditional approaches fall well short of the mark
when testing such devices: adding a new vendor-defined box to test each new
feature is ineffective, cumbersome, and inflexible. Often, a specific
instrument to test cutting-edge functionality has not even been built yet.
Consider the example of a GPS device of the kind found in most modern cars: a
huge number of traditional instruments is required to test such a device. The
disadvantages of such an approach are clearly visible. Moreover, any additional
feature that needs to be tested leaves no option but to add yet another
instrument.
The virtual instrumentation approach offers a far more effective and efficient
way to solve such testing problems. The only way to keep up with changes in the
design of the devices under test is to employ a software-oriented approach in
the test system, and this is the governing principle behind virtual
instrumentation.
A virtual instrumentation system is thus a predominantly software-defined
system, in which software based on user requirements defines the functionality
of generic, modular measurement hardware.
A similar testing application can be built using the virtual instrumentation
approach. This approach to testing mirrors the software-oriented model of the
devices themselves, and it represents a significant trend in the test and
measurement industry.
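A minimal sketch of this software-defined idea, assuming a hypothetical GenericDigitizer driver rather than any vendor's actual API, shows how the same captured samples can be turned into different "instruments" purely by software:

```python
# Illustrative sketch of virtual instrumentation: generic modular
# acquisition hardware plus software that defines the measurement.
# The GenericDigitizer class and its capture() method are hypothetical.
import numpy as np

class GenericDigitizer:
    """Hypothetical modular acquisition hardware: it only returns samples."""
    def __init__(self, sample_rate):
        self.sample_rate = sample_rate

    def capture(self, n_samples):
        # Stand-in for a real acquisition: a noisy 50 kHz tone.
        t = np.arange(n_samples) / self.sample_rate
        return np.sin(2 * np.pi * 50e3 * t) + 0.01 * np.random.randn(n_samples)

# "Instruments" are just software functions applied to the same samples.
def voltmeter(samples):
    return np.sqrt(np.mean(samples ** 2))          # RMS level

def spectrum_analyzer(samples, sample_rate):
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), 1 / sample_rate)
    return freqs[spectrum.argmax()]                # dominant frequency

digitizer = GenericDigitizer(sample_rate=1e6)
data = digitizer.capture(4096)
print(f"RMS level: {voltmeter(data):.3f} V")
print(f"Peak frequency: {spectrum_analyzer(data, 1e6) / 1e3:.1f} kHz")
```

Adding a new measurement in this model means adding another software function, not another box.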
The way forward
The way forward is definitely test and measurement equipment with user-defined
functionality, and this is one of the unique features of the virtual
instrumentation approach.
It is imperative that test instrument vendors keep pace with all these changes.
In fact, they have to stay ahead, in close collaboration with the global
OEMs/NEMs, so that the right tools are available for validating the design of
next-generation products.
From a hardware perspective, there should be a strong focus on driving overall
cost down by leveraging commercial computing and semiconductor technology. On
the software front, the way forward is to develop software that abstracts the
system complexity of modern applications, providing a powerful yet productive
means of testing even the most complex and rapidly changing technologies.
Test systems based on the virtual instrumentation approach are clearly the path
the industry is taking. This type of solution is acknowledged as the only way
to effectively meet the emerging demands of modern test and measurement
applications, as is evident from the attitudes of leading manufacturers as well
as consumers.
User issues
The convergence of RF and digital technologies constantly challenges users.
They need to rapidly update their cross-domain knowledge in order to survive
and contribute in their work. Sometimes, users need to work on new technologies
whose standards are yet to be frozen; in this situation, test tools of the
right quality may not be available.
Rapid changes in components, devices and hardware technology are reducing the
life cycle of test instruments. This means customers today need to upgrade
sooner than before.
From the user's perspective, if traditional instrumentation were used for such
an application, there would be challenges such as a very large number of
devices with a large overall footprint, no built-in synchronization platform or
bus to share data between different tests, and the need to replicate hardware
to test additional functions.
If virtual instrumentation is employed, users have to become familiar with the
approach and the tools involved.
Challenges
The multiplicity of new devices connecting to the wireline and wireless
networks as well as the functionality that they possess is truly astounding. The
greatest challenge faced by the T&M industry is to be able to keep pace and
to effectively give customers ways to test these new devices and technologies.
To manage the cost of telecom networks, many service providers are becoming
extremely sensitive to each and every cost input, because of which the quality
of tools sometimes gets compromised. This needs to be addressed immediately so
that networks can continue to deliver quality service. As customers become more
familiar with technology, they will become more demanding, and the only way out
for service providers is to improve QoS.
The shrinking life cycles of devices, components and hardware platforms
challenge test instrument vendors to provide the longer support that customers prefer.
Trends and Drivers
Spending on T&M as a share of network costs is dismally low in India. While
globally the cost of T&M equipment is 5-8 percent of total network cost, in
India the figure is as low as 0.1 percent. Indications are that T&M spend
will go up in the coming years, all the more so because the new service
providers are still in the rollout phase. Companies still see T&M spending
as expenditure; in future, however, they will see it as an investment, as they
will be required to differentiate on quality, better customer service, reliable
networks and increased productivity.
Earlier, sanctions were a major hindrance to the growth and propagation of
T&M. It must be remembered that the major suppliers of T&M equipment
are global vendors, primarily from the US, and the easing of US sanctions since
October 2001 has enabled them to sell a much broader range of products.
Service providers bought spectrum analyzers, network analyzers, protocol
analyzers, optical time domain reflectometers (OTDR), optical spectrum analyzers
(OSA) and plesiochronous digital hierarchy and synchronous digital hierarchy (PDH/SDH)
analyzers, and handheld testers. BSNL and MTNL were the big customers and will
continue to be so as they expand their networks further. Private GSM and CDMA
operators are also a huge market for these products, as most of them are
talking about improving the quality of their networks. Further, while the
thrust today is on teledensity, the emphasis will increasingly shift to quality
of service, so the potential market for T&M equipment is likely to scale
up. The signs are already evident. Before deregulation, T&M vendors were
totally dependent on government projects.
The L1 (lowest-bid) business primarily dominated T&M equipment purchases:
bulk purchases of hardware in which vendors had no choice but to quote the
lowest price. Things have changed after the entry of the private sector and the
corporatization of BSNL, and procurement has been rationalized. Another
noticeable trend is toward solutions: large vendors are pushing solutions
rather than boxes.