tasks and ensuring that the code is complete and
consistent with design specifications.
The research presented in this paper introduces a
methodology leveraging metamodeling techniques to
dissect the UI structure, identify components, and
establish transformation rules for seamless
conversion. Through an exploration of existing
literature, the pivotal role of metamodeling in UI
development is underscored, demonstrating its
potential to enhance efficiency, adaptability, and
accuracy in both design and code generation
processes. By employing metamodeling, the mapping
process is refined, leading to the creation of precise
transformation rules for UI conversion. Moreover,
utilizing source code instead of image processing
offers comprehensive insights into design structure,
while automated code generation bolsters
productivity and minimizes errors.
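To make the idea of metamodel-driven transformation rules more concrete, the sketch below illustrates, in Python, how abstract design components and their mapping onto a target framework could be represented. The names UIComponent and TransformationRule and the example rule set are illustrative assumptions, not the metamodels actually constructed in this research.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UIComponent:
    """An abstract UI element extracted from a design-tool export (hypothetical shape)."""
    kind: str                                   # e.g. "button", "text_input", "label"
    properties: Dict[str, str] = field(default_factory=dict)
    children: List["UIComponent"] = field(default_factory=list)

@dataclass
class TransformationRule:
    """Maps a component kind in the source (design) metamodel to a target-framework template."""
    source_kind: str
    target_template: str                        # placeholder syntax: {property_name}

# Placeholder rule set for an HTML-like target framework; deriving such
# mappings systematically is what the metamodeling phase is intended to support.
RULES: Dict[str, TransformationRule] = {
    "button":     TransformationRule("button", "<button>{label}</button>"),
    "text_input": TransformationRule("text_input", '<input type="text" name="{name}">'),
    "label":      TransformationRule("label", "<span>{text}</span>"),
}
```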
The proposed research design entails three
phases: source and target models analysis,
metamodeling, and code generation. Initial data
collection involved scrutinizing export source files
from prominent UI design applications to identify
design components. Subsequently, metamodels for
UI source code and designated frameworks were
constructed, enabling the extraction of transformation
rules based on established relationships. The final
phase saw the implementation of an automated code
generator that produces the application UI from the
analyzed data and transformation rules.
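As a rough illustration of the third phase, the following self-contained sketch applies a rule table to an abstract component tree and emits target markup. The component layout, the rule format (shown here as plain template strings for brevity), and the example login form are assumptions made for illustration rather than the generator actually implemented in this research.

```python
from typing import Dict, Union

# A component is a nested dict: {"kind": ..., "properties": {...}, "children": [...]}.
Component = Dict[str, Union[str, dict, list]]

RULES: Dict[str, str] = {          # source component kind -> target template
    "form":       "<form>{children}</form>",
    "text_input": '<input type="text" name="{name}">',
    "button":     "<button>{label}</button>",
}

def generate(component: Component) -> str:
    """Recursively apply transformation rules to a component tree."""
    template = RULES[component["kind"]]
    rendered_children = "".join(generate(c) for c in component.get("children", []))
    return template.format(children=rendered_children, **component.get("properties", {}))

# Usage: a login form described abstractly, generated into HTML-like markup.
login_form = {
    "kind": "form",
    "children": [
        {"kind": "text_input", "properties": {"name": "username"}},
        {"kind": "button", "properties": {"label": "Sign in"}},
    ],
}
print(generate(login_form))
# <form><input type="text" name="username"><button>Sign in</button></form>
```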
To explore the generation of front-end component
source code, we investigated the feasibility of
model-driven principles, which gained prominence a
decade ago but remain well suited to automating web
development and offer a way to bridge the gap between
design and development. As a result, the solution
reduces development time: by generating code from
wireframes, developers can focus on complex
functionality and business logic. Automation also
frees developers from repetitive tasks, allowing them
to invest their expertise in higher-level aspects of
application development. Moreover, automated code
generation can bridge the gap between designers and
developers by providing common ground for
communication and iteration. During the research,
limitations arose due to difficulties in exporting
source code from various UI design applications.
Draw.io was identified as a potential solution, yet it
also has limitations, as outlined in the preceding
sections. Despite these challenges, the proposed
solution shows promise in transforming UI
development processes. Future research will explore
leveraging Figma, a UI design tool with extensive
plug-in integration capabilities.
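For context on what such a design-tool export looks like, the sketch below reads a small, uncompressed Draw.io (diagrams.net) XML fragment and lists its drawable cells as candidate UI components. Real exports are often compressed and their style strings vary widely, which is part of the export limitation noted above; the fragment and field choices here are illustrative assumptions only.

```python
import xml.etree.ElementTree as ET

# A tiny, hand-written stand-in for an uncompressed Draw.io XML export.
SAMPLE = """
<mxGraphModel>
  <root>
    <mxCell id="0"/>
    <mxCell id="1" parent="0"/>
    <mxCell id="2" value="Sign in" style="rounded=1;" vertex="1" parent="1"/>
    <mxCell id="3" value="Username" style="shape=rect;" vertex="1" parent="1"/>
  </root>
</mxGraphModel>
"""

root = ET.fromstring(SAMPLE)
for cell in root.iter("mxCell"):
    if cell.get("vertex") == "1":          # keep only drawable shapes
        print(cell.get("id"), cell.get("value"), cell.get("style"))
```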
REFERENCES
Amankwah-Amoah, J. et al. (2021) COVID-19 and digitalization: The great acceleration, Journal of Business Research, 136, pp. 602–611. DOI:10.1016/j.jbusres.2021.08.011.
Batdalov, R. and Nikiforova, O. (2018) Three patterns of data type composition in programming languages, ACM International Conference Proceeding Series. DOI:10.1145/3282308.3282341.
Behailu, B. (2020) Automatic Code Generation from Low Fidelity Graphical User Interface Sketches Using Deep Learning. Thesis: http://ir.bdu.edu.et//handle/123456789/11292
Bouças, T. and Esteves, A. (2020) Converting Web Pages
Mockups to HTML using Machine Learning.
DOI:10.5220/0010116302170224.
Chen, J. et al. (2020) Wireframe-based UI Design Search
through Image Autoencoder, ACM Transactions on
Software Engineering and Methodology, 29(3), p. 19:1-
19:31. DOI:10.1145/3391613.
Czarnecki, K. and Helsen, S. (2006) Feature-based survey
of model transformation approaches, IBM Systems
Journal, 45(3), pp. 621–645, DOI:10.1147/sj.453.0621
Dimbisoa, W.G., Mahatody, T. and Razafimandimby, J.P. (2018) Creating a metamodel of UI components in form of model independent of the platform, 6(2), p. 5.
Djamasbi, S., Siegel, M. and Tullis, T. (2010) Generation
Y, web design, and eye tracking, International Journal
of Human-Computer Studies, 68(5), pp. 307–323.
DOI:10.1016/j.ijhcs.2009.12.006.
Draw.io, Diagram drawing tool, https://app.diagrams.net
Figma, The Collaborative Interface Design Tool, https://www.figma.com
Henry, S.L., Abou-Zahra, S. and Brewer, J. (2014) The role of accessibility in a universal web [Preprint]: https://dspace.mit.edu/handle/1721.1/88013
Horgan, D., Hackett, J., Westphalen, C.B., Kalra, D., Richer, E., Romao, M., Andreu, A.L., Lal, J.A., Bernini, C., Tumiene, B., Boccia, S. and Montserrat, A. (2020) Digitalisation and COVID-19: The Perfect Storm, Biomed Hub, 5, pp. 1–23. DOI:10.1159/000511232.
Jokela, T., Ojala, J. and Olsson, T. (2015) A Diary Study on Combining Multiple Information Devices in Everyday Activities and Tasks, p. 3912. DOI:10.1145/2702123.2702211.
Knuuttila, T. (2011) Modelling and representing: An
artefactual approach to model-based representation,
Studies in History and Philosophy of Science Part A,
42(2), pp. 262–271. DOI:10.1016/j.shpsa.2010.11.034.
Lallemand, C., Gronier, G. and Koenig, V. (2015) User experience: A concept without consensus? Exploring practitioners' perspectives through an international survey, Computers in Human Behavior, 43, pp. 35–48. DOI:10.1016/j.chb.2014.10.048.