Abstract
Programming by example (PBE) is a technology that eases data transformation tasks, especially tabular data transformation, for data analysts by automatically generating transformation programs from user-given input–output examples. Driven by the success of machine learning (ML) across many research fields, ML-based PBE research has emerged in recent years. In previous work, we developed an ML-based PBE system for tabular data transformation. That system is built on the Transformer model, which is designed for sequential data rather than for two-dimensionally structured data such as tables. Inspired by recent work applying Transformer models to tasks over two-dimensional data, such as question answering over tables and image recognition, we propose a Transformer-based model with a positional encoding designed for two-dimensional tabular data, called tabular positional encoding, to improve the model developed in our previous work. We implemented the proposed model and conducted various experiments. The experimental results show that the Transformer-based model with tabular positional encoding achieves substantially higher performance than our previous work.
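The abstract does not spell out how the tabular positional encoding is constructed, but the idea of a two-dimensional encoding can be illustrated with a small sketch. The snippet below is a minimal, hypothetical PyTorch example assuming the encoding combines separate sinusoidal row and column encodings, in the spirit of the 2D extensions used in vision Transformers [1]; the function names and the half-row/half-column channel split are illustrative assumptions, not the paper's actual formulation.

```python
import math

import torch


def sinusoidal_encoding(positions: torch.Tensor, dim: int) -> torch.Tensor:
    """Standard sinusoidal encoding (Vaswani et al. [8]) for integer positions.

    positions: integer tensor of any shape; returns a tensor of shape
    positions.shape + (dim,). Assumes dim is even.
    """
    half = dim // 2
    # Geometrically spaced frequencies, as in the original Transformer.
    freqs = torch.exp(-torch.arange(half, dtype=torch.float32)
                      * (math.log(10000.0) / half))
    args = positions.unsqueeze(-1).float() * freqs  # (..., half)
    return torch.cat([torch.sin(args), torch.cos(args)], dim=-1)


def tabular_positional_encoding(n_rows: int, n_cols: int,
                                d_model: int) -> torch.Tensor:
    """Hypothetical 2-D encoding for a table: half of the channels encode the
    row index and half the column index, so each cell gets a unique signal.
    Assumes d_model is divisible by 4."""
    rows = torch.arange(n_rows).unsqueeze(1).expand(n_rows, n_cols)
    cols = torch.arange(n_cols).unsqueeze(0).expand(n_rows, n_cols)
    return torch.cat([sinusoidal_encoding(rows, d_model // 2),
                      sinusoidal_encoding(cols, d_model // 2)],
                     dim=-1)  # (n_rows, n_cols, d_model)


# Example: a 4x3 table with 64-dimensional cell embeddings.
pe = tabular_positional_encoding(4, 3, 64)  # (4, 3, 64)
# cell_embeddings: (batch, 4 * 3, 64); add the flattened encoding to it:
# x = cell_embeddings + pe.reshape(1, 4 * 3, 64)
```

In such a design, the grid of encodings would be flattened in row-major order and added to the embedded cell tokens before they enter the Transformer encoder, so that two cells sharing a row or a column receive correlated positional signals.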
Notes
- 1. This example is cited from our previous work [7].
References
Dosovitskiy, A., et al.: An image is worth 16×16 words: transformers for image recognition at scale. arXiv preprint arXiv:2010.11929 (2020)
Herzig, J., Nowak, P.K., Müller, T., Piccinno, F., Eisenschlos, J.M.: TaPas: weakly supervised table parsing via pre-training. arXiv preprint arXiv:2004.02349 (2020)
Jin, Z., Anderson, M.R., Cafarella, M., Jagadish, H.: Foofah: transforming data by example. In: Proceedings of the 2017 ACM International Conference on Management of Data, pp. 683–698 (2017)
Ott, M., et al.: fairseq: a fast, extensible toolkit for sequence modeling. In: Proceedings of NAACL-HLT 2019: Demonstrations (2019)
Paszke, A., et al.: PyTorch: an imperative style, high-performance deep learning library. In: Advances in Neural Information Processing Systems, vol. 32, pp. 8024–8035. Curran Associates, Inc. (2019). http://papers.neurips.cc/paper/9015-pytorch-an-imperative-style-high-performance-deep-learning-library.pdf
Raman, V., Hellerstein, J.M.: Potter’s wheel: an interactive data cleaning system. In: VLDB, vol. 1, pp. 381–390 (2001)
Ujibashi, Y., Takasu, A.: Neural network approach to program synthesis for tabular transformation by example. IEEE Access 10, 24864–24876 (2022). https://doi.org/10.1109/ACCESS.2022.3155468
Vaswani, A., et al.: Attention is all you need. arXiv preprint arXiv:1706.03762 (2017)
Wang, Z., Liu, J.C.: Translating math formula images to latex sequences using deep neural networks with sequence-level training. Int. J. Doc. Anal. Recogn. (IJDAR) 24(1), 63–75 (2021)
Acknowledgments
This work was supported by the Cross-ministerial Strategic Innovation Promotion Program (SIP) Second Phase and by “Big-data and AI-enabled Cyberspace Technologies” of the New Energy and Industrial Technology Development Organization (NEDO).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Ujibashi, Y., Takasu, A. (2022). Two-Dimensional Encoding Method for Neural Synthesis of Tabular Transformation by Example. In: Pimenidis, E., Angelov, P., Jayne, C., Papaleonidas, A., Aydin, M. (eds) Artificial Neural Networks and Machine Learning – ICANN 2022. ICANN 2022. Lecture Notes in Computer Science, vol 13532. Springer, Cham. https://doi.org/10.1007/978-3-031-15937-4_27
Print ISBN: 978-3-031-15936-7
Online ISBN: 978-3-031-15937-4