
Energy Optimisation of Cascading Neural-network Classifiers

Authors: Vinamra Agrawal and Anandha Gopalan

Affiliation: Department of Computing, Imperial College London, 180 Queen's Gate, London SW7 2AZ, U.K.

Keyword(s): Neural Networks, Machine Learning, Energy Efficiency, Sustainable Computing, Green Computing.

Abstract: Artificial Intelligence is increasingly being used to improve different facets of society such as healthcare, education, transport and security. One of the popular building blocks for such AI systems is the Neural Network, which allows us to recognise complex patterns in large amounts of data. With the exponential growth of data, Neural Networks have become increasingly crucial for solving ever more challenging problems. As a result, the computational and energy requirements of these algorithms have grown immensely, making them, going forward, a major contributor to climate change. In this paper, we present techniques to reduce the energy use of Neural Networks without significantly reducing their accuracy or requiring any specialised hardware. In particular, our work focuses on Cascading Neural Networks and on reducing the dimensions of the input space, which in turn allows us to create simpler classifiers that are more energy-efficient. We reduce the input complexity by using semantic data (colour, edges, etc.) from the input images and systematic techniques such as LDA. We also introduce an algorithm to efficiently arrange these classifiers to maximise the gain in energy efficiency. Our results show a 13% reduction in energy usage over the popular Scalable-effort classifier and a 35% reduction when compared to the Keras CNN for CIFAR-10. Finally, we also reduce the energy usage of the full-input neural network (often used as the last stage in the cascading technique) by using Bayesian optimisation with adjustable parameters and minimal assumptions to search for the best model under given energy constraints. Using this technique, we achieve significant energy savings of 29% and 34% for MNIST and CIFAR-10 respectively.
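The cascading idea described in the abstract can be illustrated with a small, hedged Python sketch (this is not the authors' code): a cheap first-stage classifier works on an LDA-reduced input and only defers low-confidence samples to a full-input model. The scikit-learn models, the digits dataset and the 0.9 confidence threshold below are illustrative assumptions; the paper itself evaluates CNNs on MNIST and CIFAR-10.

# Minimal sketch of a two-stage cascade with LDA-reduced inputs.
# Assumptions: scikit-learn stand-ins for the paper's CNNs, the digits
# dataset instead of MNIST/CIFAR-10, and a 0.9 confidence threshold.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Stage 1 (cheap): LDA projects the 64 pixel features down to 9 dimensions,
# and a small linear model classifies the reduced representation.
lda = LinearDiscriminantAnalysis(n_components=9).fit(X_train, y_train)
stage1 = LogisticRegression(max_iter=1000).fit(lda.transform(X_train), y_train)

# Stage 2 (expensive): a full-input model, standing in for the full-size
# network used as the last stage of the cascade.
stage2 = LogisticRegression(max_iter=2000).fit(X_train, y_train)

def cascade_predict(X, confidence_threshold=0.9):
    # Accept stage-1 predictions when confident; defer the rest to stage 2,
    # so only uncertain inputs pay the cost of the larger model.
    proba = stage1.predict_proba(lda.transform(X))
    confident = proba.max(axis=1) >= confidence_threshold
    preds = stage1.classes_[proba.argmax(axis=1)]
    if (~confident).any():
        preds[~confident] = stage2.predict(X[~confident])
    return preds, confident

preds, confident = cascade_predict(X_test)
print("accuracy: %.3f, handled by cheap stage: %.1f%%"
      % (np.mean(preds == y_test), 100 * confident.mean()))

In the paper's setting, the ordering of such stages is itself chosen by the proposed arrangement algorithm so as to maximise the expected energy saving.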

CC BY-NC-ND 4.0


Paper citation in several formats:
Agrawal, V. and Gopalan, A. (2020). Energy Optimisation of Cascading Neural-network Classifiers. In Proceedings of the 9th International Conference on Smart Cities and Green ICT Systems - SMARTGREENS; ISBN 978-989-758-418-3; ISSN 2184-4968, SciTePress, pages 149-158. DOI: 10.5220/0009565201490158

@conference{smartgreens20,
author={Vinamra Agrawal and Anandha Gopalan},
title={Energy Optimisation of Cascading Neural-network Classifiers},
booktitle={Proceedings of the 9th International Conference on Smart Cities and Green ICT Systems - SMARTGREENS},
year={2020},
pages={149-158},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0009565201490158},
isbn={978-989-758-418-3},
issn={2184-4968},
}

TY - CONF
JO - Proceedings of the 9th International Conference on Smart Cities and Green ICT Systems - SMARTGREENS
TI - Energy Optimisation of Cascading Neural-network Classifiers
SN - 978-989-758-418-3
IS - 2184-4968
AU - Agrawal, V.
AU - Gopalan, A.
PY - 2020
SP - 149
EP - 158
DO - 10.5220/0009565201490158
PB - SciTePress
ER -