DOI: 10.1145/3459104.3459152
research-article

A Tensor Formalism for Computer Science

Published: 20 July 2021

ABSTRACT

Over recent years, tensors have emerged as the preferred data structure for model representation and computation in machine learning. However, current tensor models lack a formal basis: tensors are treated as arbitrary multidimensional data processed by a large and ever-growing collection of functions added ad hoc. In this way, tensor frameworks degenerate into programming languages with a curiously cumbersome data model. This paper argues that a more formal basis for tensors and their computation brings important benefits. The proposed formalism is based on 1) a strong type system for tensors with named dimensions, 2) a common model of both dense and sparse tensors, and 3) a small, closed set of tensor functions, providing a general mathematical language in which higher-level functions can be expressed.
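The abstract does not spell out the formalism's concrete representation, so the following is only a minimal sketch, in plain Python with invented names, of the dense/sparse unification: a single cell-based model in which an address maps dimension names to either an integer index (an indexed, "dense" dimension) or a string label (a mapped, "sparse" dimension), so both kinds of tensors share one structure.

# Illustrative sketch only, not the paper's or Vespa's API: one cell-based
# representation covering dense and sparse tensors. A cell address maps
# dimension *names* to an integer index (indexed dimension) or a string
# label (mapped dimension); a tensor is just its set of cells.

from typing import Dict, FrozenSet, Tuple, Union

Label = Union[int, str]                 # index for indexed dims, label for mapped dims
Address = FrozenSet[Tuple[str, Label]]  # {(dimension name, index-or-label), ...}
Cells = Dict[Address, float]

def show(cells: Cells) -> None:
    """Print each cell as '{dim:label, ...}: value', ordered by dimension name."""
    for address, value in cells.items():
        addr = ", ".join(f"{d}:{l}" for d, l in sorted(address, key=lambda p: p[0]))
        print(f"{{{addr}}}: {value}")

# Dense: every index combination of a hypothetical type tensor(x[2],y[2]) has a cell.
dense: Cells = {
    frozenset({("x", 0), ("y", 0)}): 1.0,
    frozenset({("x", 0), ("y", 1)}): 2.0,
    frozenset({("x", 1), ("y", 0)}): 3.0,
    frozenset({("x", 1), ("y", 1)}): 4.0,
}

# Sparse: only the labels that actually occur in a hypothetical type tensor(key{}) have cells.
sparse: Cells = {
    frozenset({("key", "foo")}): 0.5,
    frozenset({("key", "bar")}): 1.5,
}

show(dense)
show(sparse)

With all cells expressed this way, the same functions apply to dense, sparse, and mixed tensors alike, which is what makes a single closed function set sufficient.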

These features work together to provide three benefits. Ease of use results from static type verification with meaningful dimension names. Improved interoperability results from defining a closed set of just six foundational tensor functions. Better support for performance optimization results from needing low-level optimizations only for this small set of core functions, from higher-level operations being expressible over arbitrary combinations of these functions, and from the better mathematical properties of named tensor dimensions without inherent order. The proposed model is implemented as the model inference engine in the Vespa big data serving engine, where it runs models expressed directly in this language as well as models expressed in the TensorFlow or ONNX formats.
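As an equally informal sketch of why a small closed function set suffices (standalone Python; join and reduce_sum are illustrative stand-ins, not the paper's definitions), matrix multiplication needs no primitive of its own: it is a join that multiplies cells agreeing on a shared named dimension, followed by a reduction that sums that dimension away, with dimension names rather than positions doing the alignment.

# Illustrative sketch only: matrix multiplication as a composition of two
# foundational-style functions over named dimensions.

from typing import Callable, Dict, FrozenSet, Tuple, Union

Label = Union[int, str]
Address = FrozenSet[Tuple[str, Label]]
Cells = Dict[Address, float]

def join(a: Cells, b: Cells, combine: Callable[[float, float], float]) -> Cells:
    """Combine every pair of cells from a and b whose shared named dimensions agree."""
    out: Cells = {}
    for addr_a, va in a.items():
        dims_a = dict(addr_a)
        for addr_b, vb in b.items():
            dims_b = dict(addr_b)
            shared = dims_a.keys() & dims_b.keys()
            if all(dims_a[d] == dims_b[d] for d in shared):
                out[frozenset({**dims_a, **dims_b}.items())] = combine(va, vb)
    return out

def reduce_sum(t: Cells, dim: str) -> Cells:
    """Sum out one named dimension, keeping the rest of each address."""
    out: Cells = {}
    for address, value in t.items():
        rest = frozenset((d, l) for d, l in address if d != dim)
        out[rest] = out.get(rest, 0.0) + value
    return out

# a has a hypothetical type tensor(i[2],j[2]), b has tensor(j[2],k[2]); both are just cells.
a: Cells = {frozenset({("i", i), ("j", j)}): float(i + j)
            for i in range(2) for j in range(2)}
b: Cells = {frozenset({("j", j), ("k", k)}): float(j * k + 1)
            for j in range(2) for k in range(2)}

# Matrix multiplication: multiply where the shared dimension j agrees, then sum j away.
product = reduce_sum(join(a, b, lambda x, y: x * y), "j")
for address, value in sorted(product.items(), key=lambda cell: sorted(cell[0])):
    print(dict(sorted(address)), value)

Other higher-level operations decompose the same way, so only the handful of foundational functions needs low-level optimization.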


  • Published in

    ISEEIE 2021: 2021 International Symposium on Electrical, Electronics and Information Engineering
    February 2021
    644 pages
    ISBN: 9781450389839
    DOI: 10.1145/3459104

    Copyright © 2021 ACM


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Qualifiers

    • research-article
    • Research
    • Refereed limited
