Transactions on Pattern Analysis and Machine Intelligence, 35(8):1798–1828.
Bollegala, D. and Bao, C. (2018). Learning word meta-embeddings by autoencoding. In Proceedings of the 27th International Conference on Computational Linguistics, pages 1650–1661.
Bollegala, D., Hayashi, K., and Kawarabayashi, K.-i.
(2017). Think globally, embed locally—locally
linear meta-embedding of words. arXiv preprint
arXiv:1709.06671.
Charles, D., Gabriel, M., and Furukawa, M. F. (2013). Adoption of electronic health record systems among US non-federal acute care hospitals: 2008–2012. ONC Data Brief, 9:1–9.
Chen, Y., Perozzi, B., Al-Rfou, R., and Skiena, S. (2013).
The expressive power of word embeddings. arXiv
preprint arXiv:1301.3226.
Choi, E., Bahadori, M. T., Searles, E., Coffey, C., Thompson, M., Bost, J., Tejedor-Sojo, J., and Sun, J. (2016a). Multi-layer representation learning for medical concepts. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 1495–1504. ACM.
Choi, E., Bahadori, M. T., Song, L., Stewart, W. F., and Sun, J. (2017). GRAM: Graph-based attention model for healthcare representation learning. In Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 787–795. ACM.
Choi, E., Schuetz, A., Stewart, W. F., and Sun, J. (2016b). Medical concept representation learning from electronic health records and its application on heart failure prediction. arXiv preprint arXiv:1602.03686.
Choi, Y., Chiu, C. Y.-I., and Sontag, D. (2016c). Learning
low-dimensional representations of medical concepts.
AMIA Summits on Translational Science Proceedings,
2016:41.
Chowdhury, S., Zhang, C., Yu, P. S., and Luo, Y. (2019).
Mixed pooling multi-view attention autoencoder for
representation learning in healthcare. arXiv preprint
arXiv:1910.06456.
Coates, J. and Bollegala, D. (2018). Frustratingly easy meta-embedding – computing meta-embeddings by averaging source word embeddings. arXiv preprint arXiv:1804.05262.
Elixhauser, A. and Palmer, L. (2015). Clinical Classifications Software (CCS). Agency for Healthcare Research and Quality; 2014 [cited 2015].
Goldberg, Y. (2016). A primer on neural network models
for natural language processing. Journal of Artificial
Intelligence Research, 57:345–420.
Golub, G. H. and Reinsch, C. (1970). Singular value decomposition and least squares solutions. Numerische Mathematik, 14(5):403–420.
Jensen, P. B., Jensen, L. J., and Brunak, S. (2012). Mining electronic health records: towards better research applications and clinical care. Nature Reviews Genetics, 13(6):395.
Johnson, A. E., Pollard, T. J., Shen, L., Lehman, L.-w. H., Feng, M., Ghassemi, M., Moody, B., Szolovits, P., Celi, L. A., and Mark, R. G. (2016). MIMIC-III, a freely accessible critical care database. Scientific Data, 3:160035.
Knake, L. A., Ahuja, M., McDonald, E. L., Ryckman, K. K., Weathers, N., Burstain, T., Dagle, J. M., Murray, J. C., and Nadkarni, P. (2016). Quality of EHR data extractions for studies of preterm birth in a tertiary care center: guidelines for obtaining reliable data. BMC Pediatrics, 16(1):59.
Luo, Y., Tang, J., Yan, J., Xu, C., and Chen, Z. (2014). Pre-trained multi-view word embedding using two-side neural network. In AAAI, pages 1982–1988.
Ma, T., Xiao, C., Zhou, J., and Wang, F. (2018). Drug similarity integration through attentive multi-view graph auto-encoders. arXiv preprint arXiv:1804.10850.
van der Maaten, L. and Hinton, G. (2008). Visualizing data using t-SNE. Journal of Machine Learning Research, 9(Nov):2579–2605.
Miotto, R., Li, L., Kidd, B. A., and Dudley, J. T. (2016). Deep patient: an unsupervised representation to predict the future of patients from the electronic health records. Scientific Reports, 6:26094.
Nair, V. and Hinton, G. E. (2010). Rectified linear units improve restricted Boltzmann machines. In Proceedings of the 27th International Conference on Machine Learning (ICML-10), pages 807–814.
Pennington, J., Socher, R., and Manning, C. (2014). GloVe: Global vectors for word representation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1532–1543.
Roweis, S. T. and Saul, L. K. (2000). Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500):2323–2326.
Shickel, B., Tighe, P. J., Bihorac, A., and Rashidi, P. (2018). Deep EHR: A survey of recent advances in deep learning techniques for electronic health record (EHR) analysis. IEEE Journal of Biomedical and Health Informatics, 22(5):1589–1604.
Tran, T., Nguyen, T. D., Phung, D., and Venkatesh, S. (2015). Learning vector representation of medical objects via EMR-driven nonnegative restricted Boltzmann machines (eNRBM). Journal of Biomedical Informatics, 54:96–105.
Wold, S., Esbensen, K., and Geladi, P. (1987). Principal component analysis. Chemometrics and Intelligent Laboratory Systems, 2(1-3):37–52.
Yin, W. and Schütze, H. (2015). Learning meta-embeddings by using ensembles of embedding sets. arXiv preprint arXiv:1508.04257.
HEALTHINF 2020 - 13th International Conference on Health Informatics