
1 Introduction

Concern over the decline in Japan’s manufacturing competitiveness has increased in recent years. In particular, falsification of inspection data is a social problem that could undermine Japan’s manufacturing industry, which is founded on a dedication to high quality [1]. Product quality ensures safety and customer satisfaction with products and services, so a decline in quality reduces trust in producers and distributors, causing great economic loss. As the complexity of products and services and awareness of personal safety increase, the importance of inspection also increases. Inspection is an essential process in the certification of product quality and prevention of quality decline, so the data obtained from inspection must be appropriate and accurate. The four types of inspection dishonesty are listed below [2]:

(1) The specified inspection is not carried out, or some of the required inspection items are omitted.

(2) The inspection results are changed or fabricated.

(3) The inspection conditions are changed to produce a passing result.

(4) The inspection is repeated until a passing result is obtained.

To prevent such fraudulent inspection practices, transparency in the inspection process is necessary; that is, the process must be "made visible". Thorough "visualization" facilitates the early detection and prevention of various dishonest practices and may be a highly effective countermeasure. We previously developed AI-forms, i.e., artificial-intelligence electronic forms that interact through speech, as a solution for improving the standard work process in workplaces [3]. AI-forms is intended to increase efficiency and visibility in the inspection process and has brought about improvements at production-site workplaces with Internet of Things management systems. AI-forms improves both productivity (efficiency) and collection of operation records (visibility) by using speech interaction to achieve "ease of reading and writing", "ease of operations such as work handover", etc., which are often problems with conventional electronic forms. AI-forms improves the efficiency of the procedures for checking task descriptions and inputting task results based on the concept of a natural user interface (NUI). An NUI enables a person to operate a machine using his/her senses and perform actions naturally [4]. The NUI of AI-forms increases productivity by enabling hands- and eyes-free operation through speaking and listening via a lightweight intercom device that can be used for long periods without fatigue. An evaluation conducted in a telecom equipment factory over a period of about two years verified a 2/3 reduction in training cost, a 20% increase in productivity, and a shortening of the worker skill-improvement cycle by a factor of about 40 [3].

For "visibility", the effectiveness of AI-forms in enabling real-time collective management of inspection results was also demonstrated. For example, the starting and ending times of the task results input by speech are automatically appended to the inspection data as "when" data. This makes it possible to visualize the variance in task time that is latent in the standard work procedures and enables rapid understanding of what should be improved to increase efficiency. To further improve efficiency and reduce dishonesty, we previously proposed an identity-verification method that uses a hearable device (a terminal that integrates earphones and a microphone) with AI-forms to enable natural acquisition of "by whom" information [5]. Using the hearable device as an intercom makes it possible to verify personal identity with an ear-authentication technique, which measures the unique acoustic characteristics of a person's ear canal from sound that is emitted by the earphone, reflected within the canal, and picked up by the microphone. This technique makes it possible to capture "by whom" information in addition to "when" information, which enables visualization of the work time of each worker and provides a means of increasing efficiency. It also prevents impersonation and provides a measure against inspection being carried out by unqualified workers, which has recently become a problem [5]. Clarifying the responsibility for inspection data makes it possible to prevent failure to perform inspection or omission of inspection items due to dishonesty. Although the hearable device is an excellent NUI, the technology is at the prototype stage and is not yet widely available on the market.

Because the prevention of dishonest inspection is an urgent issue that requires an immediate response in the workplace, it is necessary to use other devices and means that are widely available. In addition, although the sound reflected within the ear canal can be recorded and saved for use in validating the ear-authentication results, such recordings are difficult for humans to check; for confirming inspection results and ensuring traceability, the recorded data should be easy for humans to check. To meet these requirements, we propose an identity-verification method with face recognition to be used with AI-forms. The face-recognition function of our method can be implemented with the camera of a smartphone, on which AI-forms can also be implemented. Because the image of a worker's face can be recorded for confirmation, the authentication result can easily be checked visually by a human, providing excellent confirmability for product traceability.

The proposed method involves integrating face recognition into a worker's workflow, i.e., introducing a security mechanism into human-computer interaction. Although our method should be compared with a conventional method that does not include face recognition, the testing methodology and conditions are not self-evident. Therefore, we plan to evaluate the proposed method with the following testing steps: (1) feasibility tests for identity verification of workers, (2) large-scale tests for identity verification of workers, and (3) large-scale tests comparing our method with a conventional method.

The remainder of this paper is organized as follows. In Sect. 2, we introduce current identity-verification methods that include face recognition. In Sect. 3, we present our proposed identity-verification method for applying face recognition to AI-forms after describing the issues concerning identity verification in the workplace, the objectives of identity verification through face recognition, and an outline of our smartphone app that integrates our identity-verification method with AI-forms. We also explain the three testing steps of our evaluation plan. In Sect. 4, we report the results of preliminary feasibility tests for the first step of our evaluation plan, in which face-recognition accuracy was evaluated on 99 photos taken under various conditions. In Sect. 5, we discuss the feasibility of and problems with our proposed method. We conclude the paper and discuss future work in Sect. 6.

2 Related Work

Legally verifying a person's identity requires verifying two points: that "the person actually exists (reality)" and that "the person is who he/she claims to be (identity)" [6]. In Japan, the foundation of reality is the family registry. There are limited situations in which "reality" needs to be strictly confirmed, but there are many situations in which "identity" needs to be verified. This confirmation of identity is called personal authentication and is often used in the same sense as identity verification. Current personal authentication can be done using three methods: (1) knowledge-based certification using information only the person in question knows, such as a password or personal identification number, (2) possession-based certification using something only the person possesses, such as an ID card or driver's license, and (3) biometric authentication by confirming a person's fingerprints, face, etc. Although knowledge-based and possession-based certification are widely used, such as at ATMs and in e-commerce, identity is transferable with both methods when the knowledge or certification is lent out or stolen. Therefore, neither is effective in preventing individuals from impersonating others. Many anticipate that biometric authentication can be a means of solving these problems [7, 8]. One advantage of biometric authentication is that there is no risk of biological information being lost or forgotten. Also, biometric authentication can be considered a means to prevent individuals from impersonating others because it uses person-specific biological information. Biometric authentication verifies identity by matching preregistered biometric information against collation information obtained through a sensor. For example, both vein authentication used in financial institutions [9] and fingerprint authentication used in national and local governments [10] require dedicated biometric-information sensors. For identity verification in the workplace, workers must be able to register biometric information and have it verified at various locations, indoors and outdoors. It is therefore desirable to use a sensor that is as portable and widely available as possible. Additional requirements are ease of confirmation in the workplace and traceability of stored recorded data. Face recognition can be accomplished using the camera of a smartphone or other such device that can be easily handled by workers. Practicality of operation [11] and verification testing [12] regarding accuracy have been reported, and many face-recognition software packages and application programs have been developed [13].

We previously developed a system for verifying the identity of ticket holders at large-scale events using face recognition, called the Ticket ID System [17]. Its effectiveness was demonstrated at over 100 concerts [18]. The face-recognition software that the Ticket ID System uses is NeoFace [15], a high-speed, high-precision commercial product. NeoFace achieved the highest performance in the Face Recognition Vendor Test 2014 conducted by the U.S. National Institute of Standards and Technology (NIST) [14]. NeoFace achieved the lowest false reject rate (FRR) of 0.3% in processing the passport/visa photo image database at a false accept rate (FAR) of 0.1% in the NIST personal-identity searches [16]. The face-recognition process is outlined in Fig. 1. In this process, registration photos are compared with collation photos to determine whether they show the same person [16]. The Ticket ID System compares registered photos of applicants with collation photos of individuals entering the event venue. First, face detection is executed by detecting and processing the facial areas of each photo. Next, facial-point detection is carried out by processing the facial-feature points of the detected areas, e.g., the eyes, nose, and mouth edges. Finally, in the collation process, the obtained facial-point positions are used to normalize the size and position of the facial areas and to measure the similarity between a registered photo and a collation photo. When the similarity measure exceeds a certain threshold, face recognition is regarded as successful. When NeoFace is implemented on a commercially available tablet terminal, the recognition result against the facial-photo information of 100,000 people is displayed within about 0.5 s [17]. We also developed a prototype system that uses attendees' selfies as input photos for face recognition, which simplified the ID equipment by requiring only smartphone cameras [19]. Face recognition is thus also expected to be effective for identity verification in the workplace.

Fig. 1. Outline of face-recognition process
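To make the collation step concrete, the following is a minimal, runnable sketch of the same detect/extract/compare flow. It uses the open-source face_recognition package (dlib-based) purely as an illustrative stand-in; it is not NeoFace, and the tolerance value is that library's default rather than a NeoFace parameter.

```python
# Illustrative stand-in only: the open-source face_recognition package (dlib),
# not NeoFace. The 0.6 tolerance is that library's default threshold.
import face_recognition


def is_same_person(registered_path: str, collation_path: str,
                   tolerance: float = 0.6) -> bool:
    """Judge whether a registered photo and a collation photo show the same person."""
    registered = face_recognition.load_image_file(registered_path)
    collation = face_recognition.load_image_file(collation_path)

    # Face detection + feature extraction (128-dimensional encodings).
    # An empty list means no face was found, e.g., under poor lighting.
    reg_enc = face_recognition.face_encodings(registered)
    col_enc = face_recognition.face_encodings(collation)
    if not reg_enc or not col_enc:
        return False

    # A distance below the tolerance plays the role of the similarity threshold
    # described above (smaller distance means more similar).
    distance = face_recognition.face_distance([reg_enc[0]], col_enc[0])[0]
    return bool(distance <= tolerance)
```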

3 Methods

3.1 Verification of Worker’s Identity in Workplaces

Figure 2 shows the conventional method of verifying a worker's identity and visualizing operation records in the workplace. Initial identity verification is personal identification conducted by a supervisor or other responsible individual, who visually checks a worker's photograph on a certificate of qualification or ID card. On-demand identity verification is the same type of personal identification conducted while workers are working; it is required during shift changes. After verification, workers input their names into AI-forms by tapping them on the panel, as shown in Fig. 3. The verification should be conducted carefully because a worker may erroneously tap the name of another person.

Fig. 2. Conventional method of worker identity verification and operation-record visualization with AI-forms

Fig. 3. Worker name selection for AI-forms

Operation-record visualization is provided by AI-forms, which makes it possible for a supervisor or productivity analyst to verify operation records, such as product IDs, working processes and times, and workers' names, to check worker productivity and confirm product traceability [3]. AI-forms provides a speech interface as a means of improving the standard work process in workplaces by making operations more efficient and visualizing processes. AI-forms improves production efficiency and visualizes the collected operation records by enhancing the readability and writability of records and the handover operations that are not sufficiently supported by traditional electronic forms.

The initial and on-demand verification method is effective against impersonation and work being done by an unqualified person, but it is not efficient because of the time needed to confirm each person. In workplaces where many workers cooperate in performing a task, work efficiency declines when the time required for worker identification is long. If more people are assigned to carry out worker identification, cost becomes a problem. An identity-verification method that effectively and efficiently prevents impersonation while suppressing cost is needed. The use of an ID card reader or a device for inputting an identification number can be considered an efficient identity-verification method that prevents work from being started unless a person has been identified as qualified. As noted in Sect. 2, however, identification by certification of knowledge or possession may not effectively prevent impersonation because either can be passed on to other people. Other identity-verification methods that are highly effective in preventing impersonation are therefore required. Biometric authentication is effective against impersonation, but the costs of registering authentication data and installing special sensors and devices at all work sites are implementation problems. Biometric authentication using fingerprints, iris patterns, blood vessel patterns, etc. requires special sensors, and the registered data and comparison data cannot be easily confirmed by humans, so ease of confirmation is another problem. As mentioned above, we previously developed an identity-verification method that uses a hearable device to enable ear authentication [5]. However, this method has problems concerning the currently low availability of such devices and the difficulty of confirming ear-authentication results by humans, so its introduction to the workplace will have to wait. In summary, the main issue for identity verification in the workplace is to efficiently and effectively prevent impersonation at low cost and in a way that can be easily confirmed by humans.

3.2 Identity Verification by Face Recognition

To improve worker identity verification and operation-record visualization in workplaces, we propose an identity-verification method that includes face recognition to be used with AI-forms, as shown in Fig. 4. Face recognition, integrated with AI-forms [3], is used for initial and on-demand identity verification of workers instead of visual checks by a supervisor or other responsible individual. AI-forms stores workers' facial images for operation-record visualization and confirmation of product traceability.

Fig. 4. Proposed method of worker identity verification and operation-record visualization

Specifically, we aim to achieve the functions listed below by using NeoFace for selecting worker names in AI-forms.

(1) Efficient identity verification

With NeoFace running on a smartphone, face-recognition processing can be completed within 0.4 s [17]. Replacing manual selection of a worker's name with face recognition enables efficient identity verification while maintaining the functionality of AI-forms.

(2) Effective prevention of impersonation

Face recognition is effective protection against impersonation because a face, unlike knowledge or a possession, is extremely unlikely to be lent, transferred, or misplaced. NeoFace provides highly accurate recognition and has a long track record of practical use [15, 16]; thus, it is a highly effective means of preventing impersonation.

(3) Low cost and ease of visual confirmation

Face recognition is carried out by a smartphone app using a photo of a worker's face taken with the smartphone camera as the collation photo. Because AI-forms is also implemented as a smartphone app, it is not necessary to introduce new equipment at the worksite, so this function can be implemented at low cost. Traceability is also excellent because facial images can be confirmed visually. Even if a worker falsifies a name, the facial image can be checked against the registered photo, so a strong deterrent effect can be expected. The workplace identity-verification methods described in the previous section are compared in Table 1.

Table 1. Comparison of identity-verification methods

3.3 AI-Forms

The concept of AI-forms is illustrated in Fig. 5 [3]. AI-forms is an interactive application composed of an AI-form class, a virtual tray, and a speech-interaction controller. The worker starts AI-forms in the workplace, as shown in Fig. 6, selects a name on the screen, as shown in Fig. 3, and creates or modifies a form.

The AI-form class is an object definition that implements the reading and writing of form data through speech interaction with the worker, controls the workflow, including work handover, pausing, and restarting of work, and measures the duration of each work procedure according to a standard task definition. The AI-form class stores the form data and work status in internal states and has functions for dialog-scenario control and time measurement. An AI-form instance corresponds to the execution state of a standard task, that is, a single form. The user creates an AI-form class by registering a work procedure and dialog scenario described in the specified Excel® format in the system.

The virtual tray provides the user with a visualization of the AI-form class, AI-form instances of uncompleted tasks, and AI-form instances of completed tasks. To visualize the task status, the virtual tray displays a list of the names of the registered forms (classes) of the AI-form class and displays the uncompleted and completed task trays. In managing AI-form instances, the virtual tray also manages a "current tray" for each AI-form instance and changes the "current tray" to the uncompleted task tray if the task is interrupted or to the completed task tray if the task is completed. If an AI-form instance in the uncompleted task tray is specified by speech interaction or a touch operation when restarting a task, the task status is changed to in-progress and the task is restarted. AI-form instance management also involves managing combinations of AI-form instances and workers or items (product components), so handing a task over to another worker or attaching an order memo to an item is implemented as a combination-change operation.

The speech-interaction controller implements task-flow control by speech interaction, such as the creation of AI-form instances, the reading and writing of form data for saved AI-form instances, and task handover, pausing, and restarting. In speech-interaction control, the user creates an AI-form instance by speaking an AI-form class name, which sets the AI-form instance to the execution state. In the execution state, the AI-form instance controls speech synthesis for reading out the task procedure, automatically records the task results by speech recognition, repeats the recognition result, and measures the working time based on the AI-form class definition. The workflow can also be controlled by speaking control words such as "pause" and "restart", and an AI-form instance can be placed in or removed from a virtual tray. A schematic sketch of these relationships is given after Fig. 6.

Fig. 5. Conceptual model of artificial-intelligence electronic forms (AI-forms) with speech interaction

Fig. 6. Workers using AI-forms at a manufacturing workplace
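As an illustration only, the following sketch models the relationships described above, i.e., an AI-form class that defines a standard task, AI-form instances that hold the execution state of individual forms, and a virtual tray that moves instances between the uncompleted and completed task trays. The class and field names are hypothetical; this is not the actual AI-forms implementation.

```python
# Hypothetical, simplified model of the AI-forms components described above;
# names and fields are illustrative, not the actual implementation.
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List, Optional


class TaskStatus(Enum):
    IN_PROGRESS = "in progress"
    PAUSED = "paused"
    COMPLETED = "completed"


@dataclass
class AIFormClass:
    """Standard task definition registered from the Excel-format description."""
    name: str
    work_procedure: List[str]      # ordered task steps read out by speech synthesis
    dialog_scenario: List[str]     # prompts used in speech interaction


@dataclass
class AIFormInstance:
    """Execution state of one standard task, i.e., a single form."""
    form_class: AIFormClass
    worker: Optional[str] = None   # combination with a worker or item
    form_data: Dict[str, str] = field(default_factory=dict)
    step_durations: Dict[str, float] = field(default_factory=dict)
    status: TaskStatus = TaskStatus.IN_PROGRESS


class VirtualTray:
    """Manages the uncompleted/completed task trays and combination changes."""

    def __init__(self) -> None:
        self.uncompleted: List[AIFormInstance] = []
        self.completed: List[AIFormInstance] = []

    def pause(self, instance: AIFormInstance) -> None:
        instance.status = TaskStatus.PAUSED
        self.uncompleted.append(instance)       # "current tray" becomes the uncompleted tray

    def restart(self, instance: AIFormInstance) -> None:
        if instance in self.uncompleted:
            self.uncompleted.remove(instance)
        instance.status = TaskStatus.IN_PROGRESS

    def complete(self, instance: AIFormInstance) -> None:
        instance.status = TaskStatus.COMPLETED
        self.completed.append(instance)         # "current tray" becomes the completed tray

    def hand_over(self, instance: AIFormInstance, new_worker: str) -> None:
        instance.worker = new_worker            # combination-change operation
```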

3.4 Identity Verification by Face Recognition in AI-Forms

Our proposed identity-verification method works together with AI-forms: the NeoFace face-recognition software is installed on the smartphone on which AI-forms runs. As described in Sect. 3.3, AI-forms enables form-data reading and writing through speech interaction with a worker as well as task-flow control, such as work handover, task interruption, and task restarting, and time measurement for each work procedure. Identity verification is carried out at the time of work handover and task restarting as well as at the beginning of a task. The process of identity verification is shown in Fig. 7. A worker opens the identity-verification app, which takes a photo of his or her face and collates the photo with the registered photo. The acquired image of the worker's face is recorded in AI-forms regardless of the recognition result. The identity-verification screen is shown on the left of Fig. 8. If identification is successful, the screen shown on the right of Fig. 8 is displayed. The worker is then authenticated as being qualified for the task and begins the task using AI-forms. The name of the authenticated worker is recorded in AI-forms. If identification fails, the worker can select his/her name from the conventional list of workers; the selected name is also recorded. The worker's name can be visually checked against his/her facial photos by a supervisor, as shown in Fig. 4.

Fig. 7. Flowchart of verification process

Fig. 8. Screenshots of face recognition (left) and AI-forms (right)
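A minimal sketch of the verification flow in Fig. 7 is given below; it is illustrative only, and the camera, recognition, and AI-forms recording operations are passed in as callables because those interfaces (smartphone camera, NeoFace, AI-forms) are outside the scope of this sketch.

```python
# Minimal sketch of the flow in Fig. 7. The camera, recognition engine, and
# AI-forms recording calls are passed in as callables, standing in for
# interfaces not shown here.
from typing import Callable, List, Optional


def verify_worker(
    capture_photo: Callable[[], bytes],                 # take facial photo
    collate: Callable[[bytes], Optional[str]],          # returns matched name or None
    record_image: Callable[[bytes], None],              # store image in AI-forms
    record_name: Callable[[str], None],                 # store name in AI-forms
    select_name_from_list: Callable[[List[str]], str],  # conventional fallback
    worker_list: List[str],
) -> str:
    photo = capture_photo()
    matched_name = collate(photo)

    # The facial image is recorded regardless of the recognition result,
    # so a supervisor can visually check it later (Fig. 4).
    record_image(photo)

    # On success the authenticated name is used; on failure the worker
    # selects his/her name from the conventional list.
    name = matched_name if matched_name is not None else select_name_from_list(worker_list)
    record_name(name)
    return name
```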

3.5 Evaluation Plan

The proposed method should be evaluated through comparison with a conventional method that does not include face recognition from the viewpoints of identity verification and visual confirmability. The main issue for identity verification in the workplace is to efficiently and effectively prevent impersonation at low cost and in a way that can be easily confirmed by humans. The following testing steps are planned for such an evaluation:

  • Step 1: Feasibility tests for identity verification of workers

    There are no published findings on applying face-recognition technology to identity verification of workers in actual workplace environments. First, it is necessary to clarify the parameters of face recognition for this purpose through feasibility tests before applying the proposed method in large-scale tests. It is also necessary to examine what types of facial photos can be acquired for face recognition in such environments. A feasibility check of visual confirmability, i.e., whether a supervisor or other responsible individual can recognize workers from their facial photos, is also required for product traceability.

  • Step 2: Large-scale tests for identity verification of workers

    NeoFace achieved the lowest FRR of 0.3% in processing the passport/visa photo image database at a FAR of 0.1% in the NIST personal-identity searches [16]. However, the FRR and FAR should be evaluated in workplace environments because those environments are completely different from the conditions of the NIST searches (a minimal sketch of how FRR and FAR could be computed from workplace collation logs follows this list). It is also necessary to evaluate the total time required for worker verification because supervisors must visually check workers' names and their facial photos when the workers select their names themselves.

  • Step 3: Large-scale tests for comparing our method with a conventional method

    The proposed method should be compared with the conventional method of worker identity verification and operation-record visualization with AI-forms, as shown in Fig. 2. Specifically, we will measure the change in productivity in the same way as in the evaluation conducted in a telecom equipment factory [3]. We will also investigate the burden on workers and supervisors in workplaces where security and service-quality assurance must be ensured and fraudulent acts must be prevented.
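As mentioned in Step 2, the following sketch shows one way the FRR and FAR could be computed from collation scores logged during the workplace tests; the data layout and thresholding are assumptions for illustration, not the NIST or NeoFace evaluation code.

```python
# Minimal sketch (hypothetical data layout) of computing FRR and FAR from
# workplace collation logs: genuine_scores are similarities of same-person
# pairs, impostor_scores are similarities of different-person pairs.
from typing import Sequence, Tuple


def frr_far(genuine_scores: Sequence[float],
            impostor_scores: Sequence[float],
            threshold: float) -> Tuple[float, float]:
    """Return (FRR, FAR) at the given similarity threshold."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return frr, far


# The threshold can then be swept to find, e.g., the FRR at a target FAR of 0.1%.
```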

4 Tests

4.1 Identity Verification and Visual Confirmability

We are currently conducting the first step of our evaluation plan. The accuracy of the recognition results must be verified before the proposed method can be introduced to actual workplaces. The control parameters for identity verification include internal and external parameters related to face recognition and operating parameters [17]. The internal parameters are physical properties of the face and are independent of the observer. Examples include caps, facial expressions, hair, eyeglasses, and makeup. The external parameters relate to how the face is seen and the situation, including face direction, lighting, background, and image resolution. The operating parameters relate to the operations performed for face recognition in the workplace. Examples include whether the camera is installed in a fixed location, whether the worker performs the operation manually, whether the worker is stationary or moving, and how many times face recognition is carried out.

Our method will be used at a factory workplace where AI-forms has been introduced. Workers may be required to wear caps, glasses, or face masks in the workplace, so whether such items are being worn is considered an internal parameter. External parameters, such as those related to changes in illumination and background, may have practical importance [19]. If the level of illumination is low, face detection may not be possible, and the faces of other workers appearing in the background may be detected [19]. Because the workplace considered in this study is indoors, the lighting is fully adequate for face recognition. Although there are various types of equipment and facilities at the site, a workspace is ensured for each worker, so the possibility of other workers' faces appearing in the recognition image is low. Therefore, deviation of a worker's face direction to the left, right, up, or down from looking straight at the camera is considered an external parameter. Concerning the operating parameters, the worker remains stationary facing a smartphone camera, and the face-recognition process is carried out once. The facial images acquired for face recognition are also checked to confirm whether they are useful to a supervisor or other responsible individual who visually verifies workers' photos and names.

4.2 Test Method

Preliminary testing was conducted with the proposed identity-verification method integrated with AI-forms at the NEC Platforms Fukushima Plant. In this workplace, tasks are performed by up to ten qualified workers. For testing, facial images were registered for 11 workers as qualified workers and the method was evaluated for accurate recognition of those workers. The workers first confirmed that the images of their faces appeared on the screen of a smartphone placed within their reach beside their work tables. The workers then tapped a button on the screen to photograph their faces, and face recognition was carried out.

First, 44 facial images were acquired by photographing 11 workers under four conditions: bare face (wearing no cap, glasses, or face mask), wearing a cap, wearing glasses, and wearing a face mask. Workers may be required to wear both a cap and glasses in the workplace, so an additional 11 images in which the 11 workers were wearing both cap and glasses were acquired, making a total of 55 images for the recognition testing. In some workplaces, it may be difficult to acquire images in which the worker is facing straight toward the camera. For example, it may not be possible to position the smartphone directly in front of a worker, so the camera may be offset upward or downward or to the left or right to some extent. We therefore acquired an additional 44 images in which the faces of the 11 workers were photographed from four directions at about a 45° angle (within the range in which both eyes are included) to serve as non-frontal evaluation images. The grand total of images used in the evaluation was therefore 99.
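For reference, the composition of the 99-image evaluation set described above can be enumerated as follows; the condition labels are ours and are used only to make the counts explicit.

```python
# Illustrative enumeration of the evaluation set; labels are ours, used only
# to confirm the image count (55 frontal + 44 non-frontal = 99).
from itertools import product

WORKERS = range(1, 12)  # 11 registered workers
FRONTAL_CONDITIONS = ["bare face", "cap", "glasses", "face mask", "cap + glasses"]
NON_FRONTAL_DIRECTIONS = ["up", "down", "left", "right"]  # about 45 degrees off-frontal

frontal = list(product(WORKERS, FRONTAL_CONDITIONS))          # 55 images
non_frontal = list(product(WORKERS, NON_FRONTAL_DIRECTIONS))  # 44 images

assert len(frontal) + len(non_frontal) == 99
```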

4.3 Test Results

(1) Efficiency of identity verification

Beginning from when the worker opened the app, face recognition was completed within 0.4 s for each of the 99 evaluation images. When recognition was successful, the worker was able to use AI-forms with speech interaction seamlessly. When recognition failed, it was possible to use AI-forms by selecting the worker's name from a list in the conventional manner. Regardless of the recognition result, the facial image acquired during identity verification was recorded correctly together with the name in AI-forms.

(2) Accuracy of identity verification

The face-recognition accuracies for the 55 images in which the workers were wearing certain items are presented in Table 2. There were no cases of recognition failure due to insufficient illumination or faces of other workers appearing in the background. For recognition using front-facing images, identity was verified for all workers except in the images in which the worker was wearing a face mask; recognition was not possible when a face mask was worn because the face-recognition software uses features associated with the mouth. Because a worker begins a task using AI-forms after identity verification, we plan to address this problem with a work rule, such as requiring that face masks be put on after identity verification. The recognition accuracies for the 44 non-frontal images are presented in Table 3. All workers were identified from the images with upward and downward face directions. The recognition accuracy was 18% for the leftward and 27% for the rightward direction. For vertical changes in face direction, facial symmetry was maintained, but the loss of left-right symmetry with changes in the horizontal direction may have affected the recognition results. We consider two solutions to address the low recognition accuracy for faces photographed from the left or right. The first is to address camera placement by selecting installation positions at which turning to face the camera directly does not burden the worker. The second, for cases in which the camera cannot be positioned properly and non-frontal facial images cannot be avoided, is to carry out identity verification with additional non-frontal registered photos. Using multiple registered photos, including left and right deviations, can improve the recognition accuracy of identity verification (a minimal sketch of this multi-registration matching follows this list).

Table 2. Accuracy of faces with cap, glasses, and face mask [%]
Table 3. Accuracy of non-frontal faces [%]

(3) Visual confirmability

Images acquired for identity verification were recorded in AI-forms together with worker names, including in cases in which recognition failed. It was confirmed that all acquired images were recorded and that they could be discriminated by human visual examination. However, visual confirmation was difficult when a worker wore a face mask. This problem could also be addressed with a work rule, such as requiring that face masks be put on after identity verification.
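The multi-registration approach mentioned in item (2) can be sketched as follows; the score function and photo representation are assumptions for illustration, not the NeoFace API.

```python
# Minimal sketch of the multi-registration idea in item (2): each worker has
# several registered photos (frontal plus left/right deviations), and the
# highest similarity across them decides the match. similarity() is a
# hypothetical stand-in for the recognition engine's score function.
from typing import Callable, Dict, Optional, Sequence


def best_match(collation_photo: bytes,
               registered_photos: Dict[str, Sequence[bytes]],
               similarity: Callable[[bytes, bytes], float],
               threshold: float) -> Optional[str]:
    """Return the matched worker name, or None if no registration passes the threshold."""
    best_name: Optional[str] = None
    best_score = float("-inf")
    for name, photos in registered_photos.items():
        # Take the best score over all registered photos of this worker.
        score = max(similarity(collation_photo, photo) for photo in photos)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```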

5 Discussion

These preliminary tests showed the initial feasibility of our identity-verification method using frontal facial images of workers who are not wearing face masks in the factory workplace where use of the method is planned. The following improvement issues were clarified through discussion with workers and supervisors after the tests:

(1) Closer integration of AI-forms and identity verification

There was a request from workers to minimize the movement of their line of sight away from the work location during the identity-verification process at task handover or when restarting a task. The objective is thus to enable continuous concentration on a task in a nearly eyes-free state. To achieve this objective, it is necessary to consider placing the camera used for identity verification in the same line of sight as the task. This will be taken into consideration during the large-scale tests of the second step of our evaluation plan.

(2) Identity verification at arbitrary times

There was a request from supervisors that the identity-verification process be carried out at arbitrary times in the work process and without workers being aware of it. The objectives are thus thoroughness in preventing impersonation and not hindering concentration on the task. Currently, facial images for identification are acquired at the beginning of a task and during work handover or restarting, but we are considering acquiring photographs at arbitrary times while a task is being carried out and recording them together with the identification results. When workers are not aware of being photographed, they may have their eyes closed, and closed eyes cause face recognition to fail [18]. To avoid such failure, the face-recognition app will be improved to take two photos of a worker at an interval of about 0.5 s so as to obtain a facial photo with the eyes open (a minimal sketch of this two-shot capture follows this list). Few people spontaneously keep their eyes closed longer than 0.5 s because human blink duration is on average between 0.1 and 0.4 s [20], and few people spontaneously blink twice in 0.5 s because the human blink rate is between 7 and 17 blinks per minute [21]. Either photo can then be properly collated with the registered photo. This improvement will be taken into consideration during the large-scale tests of the second step of our evaluation plan.

(3) Preventing impersonation

Supervisors require thoroughness in preventing impersonation. Our proposed method principally trusts workers to self-assert their identity when the smartphone app does not recognize them properly, although a supervisor visually verifies workers' names and facial photos. The most important defense for verification is therefore that, when an adversary presents an image of someone who is not one of the qualified workers, he/she is identified as a non-match.
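The two-shot capture described in item (2) of this section can be sketched as follows; the camera and collation calls are hypothetical stand-ins for the smartphone camera and the recognition engine.

```python
# Minimal sketch of the two-shot capture idea in item (2): two photos are taken
# about 0.5 s apart so that at least one is likely to show open eyes, and
# verification succeeds if either photo collates with the registered photo.
import time
from typing import Callable


def capture_and_verify(capture_photo: Callable[[], bytes],
                       collate: Callable[[bytes], bool],
                       interval_s: float = 0.5) -> bool:
    first = capture_photo()
    time.sleep(interval_s)   # longer than a typical blink of 0.1-0.4 s [20]
    second = capture_photo()
    return collate(first) or collate(second)
```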

6 Conclusion

We proposed an identity-verification method for applying face recognition to AI-forms and developed a smartphone app that integrates it with AI-forms. Preliminary feasibility testing involving 11 workers in an actual workplace confirmed that identity verification is possible when face recognition is carried out with frontal images of workers who are not wearing face masks. The face-recognition process was completed within 0.4 s, enabling workers to seamlessly begin work with AI-forms. Recording both collation photos and worker names during identity verification also made it possible for a human to visually confirm a worker's identity. Discussion with workers and supervisors after the feasibility tests provided findings for improving our face-recognition app and evaluation plan. For future work, we will improve our face-recognition app for closer integration with AI-forms and for identity verification at arbitrary times.