Abstract
This paper addresses the challenging problem of visual inspection of transparent pharmaceutical capsules, where print registration is used to determine the capsule's print region. Determining the print region enables reliable detection of defects both on the print region and on the rest of the capsule's surface. On transparent capsules, the print on the front and the print on the back of the capsule are concurrently visible. Moreover, the print on the back may be partially or entirely occluded by the powder inside the capsule. As a result, print registration methods designed for opaque capsules do not achieve adequate performance. In this paper, we present a novel registration method designed specifically for transparent capsules. The method uses template matching with a new similarity measure that takes the specific properties of transparent capsules into account to increase registration robustness. Additionally, we present a registration refinement step that reduces the effect of possible print deformations and image distortions. The performance of the method was evaluated in terms of robustness, accuracy, and speed on large image sets of four different radial prints. The new method shows greatly improved robustness (>98.6 %) compared to a method based on normalized cross-correlation (>72 %) and a method based on feature matching (>80 %). Furthermore, the additional refinement step improves the registration accuracy. Although the execution time increases from 3 to 11 ms, it still meets the usual speed requirements.












References
Berman, A.: Reducing medication errors through naming, labeling, and packaging. J. Med. Syst. 28, 9–29 (2004)
FDA: 21 CFR 206, Imprinting of solid oral dosage form drug products for human use. Available at: http://www.accessdata.fda.gov/ (2015)
Vasudevan, P., DelGianni, T., Robertson, W.O.: Avoiding medication mixups - Identifiable imprint codes. West. J. Med. 165, 352–354 (1996)
Podczeck, F., Jones, B.E.: Pharmaceutical Capsules. Pharmaceutical Press, London (2004)
Karloff, A.C., Scott, N.E., Muscedere, R.: A flexible design for a cost effective, high throughput inspection system for pharmaceutical capsules. In: IEEE International Conference on Industrial Technology, 2008. ICIT 2008. pp. 1–4 (2008)
Bukovec, M., Špiclin, Ž., Pernuš, F., Likar, B.: Automated visual inspection of imprinted pharmaceutical tablets. Meas. Sci. Technol. 18, 2921–2930 (2007)
Islam, M.J., Ahmadi, M., Sid-Ahmed, M.A.: Image processing techniques for quality inspection of gelatin capsules in pharmaceutical applications. In: 10th International Conference on Control, Automation, Robotics and Vision, 2008. ICARCV 2008. pp. 862–867 (2008)
Špiclin, Ž., Likar, B., Pernuš, F.: Real-time print localization on pharmaceutical capsules for automatic visual inspection. In: 2010 IEEE International Conference on Industrial Technology (ICIT). pp. 279–284 (2010)
Tsai, D.M., Lin, C.T.: Fast normalized cross correlation for defect detection. Pattern Recognit. Lett. 24, 2625–2631 (2003)
Grosso, E., Lagorio, A., Tistarelli, M.: Automated quality control of printed flasks and bottles. Mach. Vis. Appl. 22, 269–281 (2011)
Edwards, D.: Applications of capsule dosing techniques for use in dry powder inhalers. Ther. Deliv. 1, 195–201 (2010)
Možina, M., Tomaževič, D., Pernuš, F., Likar, B.: Real-time image segmentation for visual inspection of pharmaceutical tablets. Mach. Vis. Appl. 22, 145–156 (2011)
Derganc, J., Likar, B., Bernard, R., Tomaževič, D., Pernuš, F.: Real-time automated visual inspection of color tablets in pharmaceutical blisters. Real-Time Imaging 9, 113–124 (2003)
Cheng, Y.: Mean shift, mode seeking, and clustering. IEEE Trans. Pattern Anal. Mach. Intell. 17, 790–799 (1995)
Kuglin, C., Hines, D.: The phase correlation image alignment method. In: Proceedings of the IEEE International Conference on Cybernetics and Society, pp. 163–165 (1975)
Bracewell, R.N.: The Fourier Transform and Its Applications. McGraw-Hill Higher Education, New York (2000)
Marquardt, D.: An algorithm for least-squares estimation of nonlinear parameters. J. Soc. Ind. Appl. Math. 11, 431–441 (1963)
Druckmüller, M.: Phase correlation method for the alignment of total Solar eclipse images. Astrophys. J. 706, 1605 (2009)
Padfield, D.: Masked object registration in the Fourier domain. IEEE Trans. Image Process. 21, 2706–2718 (2012)
Lowe, D.G.: Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 60, 91–110 (2004)
Rosten, E., Drummond, T.: Machine learning for high-speed corner detection. In: Leonardis, A., Bischof, H., Pinz, A. (eds.) Computer Vision - ECCV 2006, pp. 430–443. Springer, Berlin (2006)
Rosten, E., Drummond, T.: Fusing points and lines for high performance tracking. In: Tenth IEEE International Conference on Computer Vision (ICCV 2005). Vol. 2, pp. 1508–1515 (2005)
Fischler, M.A., Bolles, R.C.: Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM. 24, 381–395 (1981)
Lohmann, A.W., Weigelt, G., Wirnitzer, B.: Speckle masking in astronomy: triple correlation theory and applications. Appl. Opt. 22, 4028 (1983)
Acknowledgments
This work was supported by Sensum, Computer Vision Systems, and by the European Union, European Social Fund.
Appendix: Efficient calculation of the similarity measures FO and BO in the Fourier domain
The calculation of \(\mathrm{FO}\) and \(\mathrm{BO}\) in the Fourier domain requires the following Fourier transforms:
The equation for \(\mathrm{FO}\) (6) can be rephrased as:
Each individual sum in \(\mathrm{FO}\) (6), (22) can be translated to the Fourier domain using the cross-correlation theorem:
where \(\star\) denotes the cross-correlation operator.
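As a concrete illustration, the cross-correlation theorem allows each such sum of products over all shifts \((u,v)\) to be evaluated with three FFTs instead of an explicit double loop over shifts. The following minimal NumPy sketch assumes circular shifts; the function name and the images `f` and `h` are illustrative, not taken from the paper:

```python
import numpy as np

def cross_correlate_fft(f, h):
    """Evaluate c(u, v) = sum_{x,y} f(x+u, y+v) * h(x, y) for all
    circular shifts (u, v) at once, via the cross-correlation theorem:
    c = IFFT( FFT(f) * conj(FFT(h)) )."""
    F = np.fft.fft2(f)
    H = np.fft.fft2(h)
    return np.real(np.fft.ifft2(F * np.conj(H)))
```

Each sum in (22) has this form, with `f` and `h` replaced by the appropriate products of \(f_{\mathrm{fg}}\), \(h\), and \(m_{\mathrm{fg}}\), so the whole of \(\mathrm{FO}\) costs a constant number of FFTs.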
To reduce the total number of required Fourier transforms, the equation for \(\mathrm{BO}\) can be expressed with the images \(f_{\mathrm{fg}}\), \(h\), and \(m_{\mathrm{fg}}\) (otherwise used in the definition of \(\mathrm{FO}\)). This is possible because the extended input image \(f(x,y)\) is horizontally symmetric (\(f^{\mathrm{F}}(x,y)=f(x,y)\)), the flipped background mask equals the foreground mask (\(m_{\mathrm{bg}}^{\mathrm{F}}(x,y)=m_{\mathrm{fg}}(x,y)\)), and because of the identities that hold for flipping an arbitrary image \(i(x,y)\) (25–27). Following the derivations (28–30), the equation for \(\mathrm{BO}\) can be rephrased as shown in (31). Using the cross-correlation theorem (23), the denominator of \(\mathrm{BO}\) can also be translated directly to the Fourier domain (32). By contrast, the numerator of \(\mathrm{BO}\) (\(\mathrm{BO}_{\mathrm{num}}\)) is in general a special case of the cross triple correlation [24] of the signals \(f_{\mathrm{fg}}(x,y)\), \((1-h(x-u,y-v))^{2}\), and \(h^{\mathrm{F}}(x,y,u,v)\); thus, it cannot be translated to the Fourier domain using the cross-correlation theorem (23) alone. Because \(\mathrm{BO}_{\mathrm{num}}\) (31) includes shifts in three different directions (\(u\), \(v\), and \(-u\)), its calculation in the Fourier domain would require a 3D Fourier transform of dimensions \(M\times N\times M\) (\(M\) is the image width, \(N\) the image height), i.e., \(O(M^{2}N\log _2 M^{2}N)\) multiplications. It turns out to be more efficient to calculate it partially in the time domain.
Since the horizontal flip in \(h^{\mathrm{F}}\) only affects the sums in the \(x\) direction, the correlation in the \(y\) direction can be calculated with the 1D cross-correlation theorem for each pixel column at each horizontal shift \(u\), while the sums over \(x\) are calculated in the time domain (33). \(\mathcal{F}_y\) denotes a column-wise 1D Fourier transform in the \(y\) direction. With this formulation, the computational complexity is reduced to \(O(M^{2}N\log _2 N)\). Note that the 1D Fourier transform \(\mathcal{F}_y \left\{ (1-h(x-u,y))^{2}\,h^{\mathrm{F}}(x+u,y) \right\} \) can be pre-calculated offline for all possible shifts \(u\).
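The hybrid scheme above can be sketched as follows. This is a minimal NumPy illustration under assumed circular shifts and a circular horizontal flip \(h^{\mathrm{F}}(x,y)=h(-x,y)\); the function and variable names are hypothetical, not the paper's implementation:

```python
import numpy as np

def bo_num_hybrid(f_fg, h):
    """Hybrid calculation of BO_num: for each horizontal shift u, build
        g_u(x, y) = (1 - h(x-u, y))**2 * hF(x+u, y)
    and correlate it with f_fg along y using 1D FFTs (cross-correlation
    theorem per column), while the sum over x stays in the time domain."""
    M, N = f_fg.shape                     # M: width (x), N: height (y)
    hF = np.roll(h[::-1, :], 1, axis=0)   # circular flip: hF[x] = h[-x mod M]
    F_f = np.fft.fft(f_fg, axis=1)        # column-wise 1D FFT of f_fg
    out = np.zeros((M, N))
    for u in range(M):
        # np.roll(h, u, 0)[x] = h[x-u];  np.roll(hF, -u, 0)[x] = hF[x+u]
        g = (1.0 - np.roll(h, u, axis=0))**2 * np.roll(hF, -u, axis=0)
        G = np.fft.fft(g, axis=1)         # could be pre-computed for all u
        c = np.real(np.fft.ifft(F_f * np.conj(G), axis=1))  # c[x, v]
        out[u, :] = c.sum(axis=0)         # time-domain sum over x
    return out
```

Since the kernel FFTs `G` depend only on \(h\), they can be pre-calculated offline for all \(M\) shifts, leaving one pointwise product, one column-wise inverse FFT, and one sum over \(x\) per shift \(u\) at run time.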
Mehle, A., Bukovec, M., Likar, B. et al. Print registration for automated visual inspection of transparent pharmaceutical capsules. Machine Vision and Applications 27, 1087–1102 (2016). https://doi.org/10.1007/s00138-016-0797-z