Serverless Urban Farming Instructional Text Named Entity Recognition


Trisna Gelar
Aprianti Nanda
Akhmad Bakhrun


Named entity recognition (NER) plays a growing role in document processing, classification, and information retrieval. Applying NER in the agricultural domain, particularly to instructional texts or transcriptions of tutorial videos, makes it easier for the general public to understand concepts and terms specific to urban farming activities, such as crop production processes and procedures, agricultural methods and tools, harvest cycles, and the handling of plant pests and diseases. spaCy, an NLP toolkit, offers two approaches to building NER models: Tok2Vec and Transformer. Each has advantages and disadvantages, differing in model size, accuracy, and prediction speed, so the choice depends on the application's needs. The resulting NER model can be deployed as a serverless application using the Function as a Service (FaaS) and Backend as a Service (BaaS) approaches. For the subtopic of fruit-crop cultivation in agricultural instructional texts, three NER models were built in this study: an IndoBERT-based model, a Tok2Vec-based model optimized for efficiency, and a Tok2Vec-based model optimized for accuracy. The efficiency-optimized Tok2Vec model performed best with an F1-score of 0.71, followed by the accuracy-optimized Tok2Vec model with an F1-score of 0.60. The Tok2Vec models are unable to predict numeric entities well, consistently mispredicting the COUNT, PERIOD, and VERIETAS entities. The efficiency-optimized Tok2Vec model's smaller size also translates directly into faster prediction speed (words per second), and the model is simple to integrate into a FaaS- and BaaS-based serverless application, whose functionality was successfully verified using black-box testing.
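As a rough illustration of the FaaS deployment pattern the abstract describes, the sketch below shows a Lambda-style handler that wraps an NER model behind a JSON request/response interface. The trained spaCy pipeline is not reproduced here; `extract_entities` is a hypothetical rule-based stand-in, and the `TANAMAN` label and gazetteer phrases are illustrative assumptions, not labels confirmed by the paper (which mentions COUNT, PERIOD, and VERIETAS among its entity types).

```python
import json

# Hypothetical gazetteer standing in for the trained spaCy NER model,
# so the handler's request/response shape can be shown without model files.
# Phrase -> entity label (label names are illustrative assumptions).
GAZETTEER = {
    "mangga": "TANAMAN",
    "pupuk kompos": "ALAT",
}

def extract_entities(text):
    """Stand-in for iterating nlp(text).ents in a real spaCy pipeline."""
    found = []
    lowered = text.lower()
    for phrase, label in GAZETTEER.items():
        start = lowered.find(phrase)
        if start != -1:
            found.append({
                "text": text[start:start + len(phrase)],
                "label": label,
                "start": start,
            })
    return found

def handler(event, context=None):
    """Lambda-style FaaS entry point: JSON body in, JSON entities out."""
    body = json.loads(event.get("body", "{}"))
    entities = extract_entities(body.get("text", ""))
    return {
        "statusCode": 200,
        "body": json.dumps({"entities": entities}),
    }
```

A black-box test, as mentioned in the abstract, would exercise only this interface: post a text, then check the status code and the returned entity list against expected values, without inspecting the model internals.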



Article Details

How to Cite
T. Gelar, A. Nanda, and A. Bakhrun, “Serverless Urban Farming Instructional Text Named Entity Recognition”, JuTISI, vol. 8, no. 3, pp. 597 –, Dec. 2022.