“debug” and “trace” levels impractical for production
environments.
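As an illustration of the point, the following is a minimal sketch (the WebServiceEndpoint class and handleRequest method are hypothetical names) of the common Log4J 1.x guard idiom, which keeps the cost of building debug messages out of production code paths when the effective level is set to "info" or above:

    import org.apache.log4j.Logger;

    public class WebServiceEndpoint {
        // Per-class logger; the effective level is configured in log4j.properties
        private static final Logger logger = Logger.getLogger(WebServiceEndpoint.class);

        public void handleRequest(String requestId) {
            // The guard skips string concatenation when "debug" is disabled,
            // so production deployments pay almost no logging overhead here
            if (logger.isDebugEnabled()) {
                logger.debug("Handling request " + requestId);
            }
            // ... actual request processing ...
        }
    }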
5 CONCLUSIONS
This paper presented the performance tools developed for the STEP Framework, a Java-based framework for Web Services and applications. More importantly, it presented trade-off discussions and lessons learned that can be applied to other Web Service frameworks facing performance issues, especially those using the Hibernate, JAX-WS, and Log4J libraries.
The performance assessment used a representative Web Service to perform experiments on: time slice breakdown, request types, SOAP size, caching, concurrent users, and logging. Assembling a tool chain to collect, process, and visualize the data was extensive work, but having it in place is highly valuable for developers in a learning environment and beyond. The detailed description of the performance analysis process provides insight into how similar techniques can be applied, and several pitfalls are described so that others can avoid them. With the new framework capabilities, future work can compare application implementation alternatives as well as diverse platforms, both physical and virtualized, providing a means to compare performance across different cloud providers. The developed performance tool chain assists in finding solutions to performance problems, one step at a time.
ACKNOWLEDGEMENTS
Miguel L. Pardal is supported by a PhD fellowship from the Portuguese Foundation for Science and Technology (FCT), grant SFRH/BD/45289/2008.