Semi-Automatic Mapping Technique Using Snowballing to Support Massive Literature Searches in Software Engineering

Abstract

Systematic literature reviews are an important methodology in Evidence-Based Software Engineering. To define the methodological route in this type of study, in which quantitative and qualitative aspects of primary studies are reviewed to summarize the existing information on a particular topic, researchers use protocols that guide the construction of knowledge from research questions. This article presents a process that uses forward snowballing, which identifies the articles that cite the paper under study and uses their number of citations as an inclusion criterion, to complement systematic literature reviews. The process relies on software tools to apply the snowballing strategy and to identify the most cited works and those that cite them. To validate the process, a review identified in the literature was used as a baseline; comparing the results revealed new works that had not been taken into account but contribute to the subject of study. The citation index, i.e., the number of times a publication has been referenced in other documents, is used as a mechanism to analyze, measure, or quantitatively assess the impact of a publication on the scientific community. The study shows how applying snowballing alongside other strategies surfaces works that may be relevant to an investigation given their citation rate; implementing this proposal thus makes it possible to update or expand systematic literature studies with the newly identified works.
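To make the described process concrete, the sketch below shows one forward-snowballing iteration in Python, with the citation count serving as the quantitative inclusion criterion. It is a minimal illustration, not the authors' tooling: the endpoint URL, the response shape, and the helper names citing_papers and forward_snowball are assumptions modeled on the public Semantic Scholar Graph API and should be checked against its current documentation.

    # Minimal sketch of one forward-snowballing iteration. The endpoint and
    # response shape are assumptions based on the public Semantic Scholar
    # Graph API; verify both against the current API documentation.
    import requests

    API = "https://api.semanticscholar.org/graph/v1/paper"

    def citing_papers(doi):
        """Fetch the papers that cite the given DOI (forward snowballing)."""
        url = f"{API}/DOI:{doi}/citations"
        resp = requests.get(url, params={"fields": "title,citationCount"})
        resp.raise_for_status()
        # NOTE: result pagination is omitted for brevity.
        return [item["citingPaper"] for item in resp.json().get("data", [])]

    def forward_snowball(seed_dois, min_citations=10):
        """One iteration: keep citing papers at or above the citation threshold."""
        selected, seen = [], set()
        for doi in seed_dois:
            for paper in citing_papers(doi):
                pid = paper.get("paperId")
                if pid in seen:
                    continue  # the same paper may cite several seeds
                seen.add(pid)
                # The citation index acts as the quantitative inclusion criterion.
                if (paper.get("citationCount") or 0) >= min_citations:
                    selected.append(paper)
        # Most-cited candidates first, to ease manual screening.
        return sorted(selected, key=lambda p: p.get("citationCount", 0), reverse=True)

In use, a reviewer would seed forward_snowball with the DOIs of the primary studies already included in the review, then screen the returned candidates against the protocol's remaining inclusion and exclusion criteria.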

Keywords

citation impact, evidence-based software engineering, massive literature searches, snowballing, software engineering, systematic mapping

Author Biographies

Elizabeth Suescún-Monsalve

Roles: supervision, investigation, writing - original draft, writing - review and editing.

Julio-Cesar Sampaio-do-Prado-Leite

Roles: supervision, investigation, writing - review and editing.

César-Jesús Pardo-Calvache

Roles: supervision, investigation, writing - review and editing.


