Model Parallelism Optimization for Distributed DNN Inference on Edge Devices