Recent developments in Artificial Intelligence have significantly improved the capabilities of intelligent recommendation systems. However, current culinary assistants that rely on static ingredient filtering or collaborative filtering lack emotional and contextual awareness. In this paper, we introduce ChefMoji, an emotionally intelligent AI-based culinary assistant that integrates facial emotion recognition, generative large language models, and contextual awareness within a single framework. We propose a hybrid facial emotion recognition approach that detects end-users' emotions and dynamically adjusts recipe suggestions to their moods. We use prompt-engineered generative large language models to produce structured recipe outputs with ingredients, preparation steps, calorie counts, and pairing suggestions. Moreover, we incorporate contextual factors such as weather and calendar events, together with persistent pantry storage, to make the proposed model more practical and applicable in real-life scenarios. Experimental observations indicate that the proposed model attains higher personalization relevance, contextual flexibility, and end-user-centricity in recommendation quality than existing AI-based culinary assistants. These results suggest that ChefMoji can play an important role in developing future intelligent culinary assistants.
The recent rapid development of Artificial Intelligence has greatly advanced intelligent recommendation systems in fields such as health care, entertainment, education, and lifestyle management. More recently, AI-based culinary recommendation systems have begun providing users with meaningful assistance in meal planning and recipe selection. Existing recipe recommendation systems, however, predominantly rely on static parameters (i.e., cuisine type, preparation time, dietary preference, or popularity) and do not consider facial expressions, which reflect a person's mood at that moment, or contextual information alongside those parameters when recommending recipes.
For example, users typically show varied facial expressions when selecting food: cheerful expressions can indicate openness to a new or celebratory dish, while tired or sad expressions suggest a preference for quick, easy meals. However, the majority of current recipe applications do not use real-time visual indicators to personalize their recommendations. As a result, the outputs of many existing systems do not reflect real-time situational conditions such as the user's experience, facial expression, or visible mood.
To overcome this shortcoming, this paper proposes ChefMoji, a facial expression-based and contextually aware AI-assisted cooking assistant that aims to improve the personalization of recipe suggestions. The proposed system uses Facial Expression Recognition (FER) methods to interpret the user's facial image and identify visible expressions such as happy, sad, angry, surprised, and neutral. These identified expressions serve as mood indicators derived from facial cues rather than from deep emotional or psychological profiling.
ChefMoji does more than recognize expressions. It also incorporates contextual signals such as the weather, the time of day, the contents of the user's pantry, and the user's food preferences. A generative language model combines all of this information to produce recipe suggestions tailored to the user, returning a ranked list of recipes that can be prepared with the ingredients at hand.
ChefMoji also makes the recipes easy to follow. A text-to-speech feature reads the instructions aloud so the user does not need to read while cooking, and an easy-to-use web interface displays the recipes. Because the system tracks the contents of the user's pantry, it can suggest recipes that use the ingredients already available, making cooking with ChefMoji straightforward.
The significance of this research lies in demonstrating the potential of integrating facial expression recognition and contextual intelligence to enhance the relevance and effectiveness of AI-powered recipe recommendation systems. ChefMoji's focus on overt facial expressions, as opposed to deep emotional analysis, makes it a lightweight, interpretable, and user-friendly solution for a personalized cooking assistant. This study shows the potential of facial recognition and generative AI for developing human-centric and engaging cooking assistant systems.
Recent developments in Artificial Intelligence (AI), affective computing, and recommender systems have greatly impacted the creation of intelligent culinary assistants. This section discusses previous research in the major domains related to ChefMoji: Facial Emotion Recognition (FER), AI-based recipe generation, and recommendation and personalization.
Facial Emotion Recognition (FER) has been researched heavily in areas of computer vision and affective computing. One of the first surveys of the FER literature came from I. M. Revina and W. S. Emmanuel [1] as they examined several traditional categories of FER techniques, such as geometric extraction and appearance-based approaches for feature extraction.
Deep learning methods subsequently yielded far better performance for classifying emotions than traditional techniques. Shan Li and Weihong Deng [2] conducted a survey of CNN-based FER systems and showed their superiority over handcrafted features across several classification methods. Similarly, Jyoti Kumari et al. [7] analyzed various model types used to classify five basic human emotions: happiness, sadness, anger, fear, and surprise.
Emotion-aware systems have become increasingly relevant to adaptive human-computer interaction. N. Agrawal et al. [8] demonstrated mood detection using FER Pipelines while J. X. Y. Lek and J. Teo [9] explored a number of applications of emotion classification in the academic setting. All these works illustrate how the field of affective computing enhances the ability to provide personalized service, and this is in keeping with one of the core principles of ChefMoji.
Meanwhile, with the development of generative models, AI-assisted recipe generation has attracted increasing research attention. D. Patel and K. Patel [4] presented an intelligent culinary assistant built on the GPT-2 model, which generates recipes from user inputs. The authors demonstrated the effectiveness of structured recipe generation using transformer models.
Similarly, S. Navaneeth et al. [3] proposed a machine learning-based intelligent recipe generator that produces personalized recipes under user constraints. In a related study, "ChefAI: Generating Indian Cuisine Recipes Using AI Algorithms," S. Chaudhary et al. [5] focus on generating Indian cuisine with AI algorithms.
Recently, multimodal models have been proposed to extend the domain of intelligent recipe generation. B. Hannon et al. [15] proposed Chef DALL·E, which composes cooking experiences using multimodal models that combine text and image generation. Another study, "Intelligent Pantry System Based on Chatbots to Generate Recipes" by S. Aliyan and A. P. D. F. Gongor, explores an intelligent pantry system that uses a chatbot to generate recipes autonomously.
Collectively, the studies discussed above have established the foundation for the development of generative AI-powered intelligent recipe generators, which ChefMoji extends to the domain of emotional intelligence.
Besides generation, the key aspects of a recommendation model are ranking and personalization. Y. Tian et al. [13] introduced RecipeRec, a heterogeneous graph learning model that considers ingredient relationships and user preferences. The heterogeneous graph made the recommendations more accurate by modeling the complex relationships between users and recipes.
Recipe recommendation systems commonly combine two filtering methods: content-based filtering over the properties of the food, and rule-based (constraint) filtering that enforces hard requirements. The latter is especially important for food allergies and dietary restrictions. ChefMoji's recipe ranking builds on these methods.
Nowadays, AI culinary assistants are based on large-scale API-driven architectures. H. Habib in [12] offered implementation strategies to develop intelligent conversational applications using the OpenAI API. These best practices were followed to design the modular and fallback-based architecture of ChefMoji.
Although the existing literature shows good progress in Facial Emotion Recognition, AI-based recipe generation, nutrition-aware recommender systems, and smart pantry management, most of these works operate in isolation within their respective domains.
However, little work integrates emotion detection, contextual awareness (including weather and calendar events), AI-based dynamic generation, smart pantry persistence, and conversational interaction in a single culinary tool. ChefMoji aims to fill this research void by combining affective computing, AI-based generation, and nutrition-aware recommendation in one emotionally intelligent cooking tool.
Table 1: Salient Findings of various authors for Facial Expression and Recipe Generation
| S.No. | Author Name | Topic | Methodology | Accuracy | Findings | Research Gap |
|---|---|---|---|---|---|---|
| 1 | I. Michael Revina & W.R.S. Emmanuel (2018) | Survey on Human Face Expression Recognition (FER) Techniques | Survey of image-based FER systems covering preprocessing, feature extraction, and classification; compared techniques such as Gabor, LBP, and HOG with classifiers such as SVM and CNN on datasets like JAFFE and CK. | 75%–99% (highest with ROI + Gabor + SVM) | SVM gives the highest accuracy; Gabor features are effective; JAFFE and CK are the most used datasets. | Illumination and pose-variation issues; need for better real-time and side-view FER systems. |
| 2 | Shan Li & Weihong Deng (2018) | Deep Learning-Based Facial Expression Recognition (FER) Survey | Reviewed deep FER systems covering pre-processing, deep feature learning (CNN, DBN, RNN, GAN), and classification on datasets such as CK+, FER2013, JAFFE, RAF-DB, and AffectNet. | 70%–99% in the reviewed works (highest on CK+ with deep CNN models). | Deep CNNs outperform traditional methods. | Overfitting due to small datasets. |
| 3 | S. Navaneeth, V.R. Pragathi, M. Deepak, R. Arun Kumar (2025) | AI-Based Smart Recipe Generator using ML | Used EfficientNetV2B0 (CNN) for ingredient recognition (51 classes, 4,501 images), the Gemini API for recipe generation, and Stable Diffusion (Hugging Face) for recipe image generation. | 96.87% classification accuracy (97.26% validation accuracy) | High accuracy in ingredient detection. | Limited to 51 ingredient classes. |
| 4 | Darshan Patel & Kavan Patel (2025) | AI-Based Recipe Generator using GPT-2 & Multi-Modal AI | Fine-tuned GPT-2 (124M) for recipe generation, ML classifiers (SVM, RF, NB) for cuisine and meal prediction, and a two-tier nutrition engine (local DB + USDA API). | 85% recipe coherence; 72.9% cuisine accuracy; 64% meal-type accuracy | GPT-2 generates practical recipes in under 3 seconds. | Meal-type accuracy is moderate. |
| 5 | Smriti Chaudhary, Archi Mamaniya, Binita Soni, Ashwini Dalvi, Aditya Sindhavad, Irfan Siddavatam (2022) | AI-based Indian Recipe Generator (ChefAI.IN) | Used the AutoChef evolutionary (genetic) algorithm to generate new Indian recipes from a dataset of ~6,000 Indian recipes; applied data cleaning, preprocessing, JSON formatting, and mutation- and similarity-based generation; developed a web app for output. | Survey-based evaluation: 76.5% found instructions understandable, 70.6% found actions suitable, ~86% willing to cook a generated recipe | Successfully generated unique Indian cuisine recipes. | Instruction clarity can improve (timing and temperature missing). |
| 6 | Jyoti Kumari, R. Rajesha, KM. Pooja | Facial expression recognition: A survey | Compared LBP, LGC, LDP, and HOG on the JAFFE dataset using KNN. | Highest: 92.30% (LGC-VD) | LGC-VD best; LBP and LGC strong; LDP weakest. | Needs better accuracy, occlusion handling, and real-time/dynamic FER. |
| 7 | S. Happy & Aurobinda Routray | Facial Expression Recognition using Local Binary Patterns | Used LBP for feature extraction with an SVM classifier; evaluated on the CK+ dataset. | 94.09% (CK+) | LBP is effective for facial emotion recognition, with good performance on posed expressions. | Limited performance on spontaneous expressions; needs improvement for real-time use and varied lighting conditions. |
| 8 | Kyriakos Kalpakoglou, Lorena Calderón-Pérez, Metin Guldas, Çağla Erdoğan Demir, Gymnopoulos, Kosmas Dimitropoulos (2025) | AI-based nutrition recommendation system with Mediterranean cuisine insights | Developed an AI system that generates personalized weekly meal plans from an expert-validated Mediterranean food database; evaluated on 4,000 simulated user profiles. | Evaluated on dish/meal filtering accuracy, meal diversity/balance, and calorie and macronutrient recommendation accuracy. | The system was effective at generating balanced, personalized nutrition plans tailored to individual preferences and needs. | Needs real-world user testing and clinical validation beyond simulation. |
| 9 | A. Karunamurthy, A. Sulaiha, Y. Yashitha Begam (2024) | AI-Driven Recipe-Generating Chatbot for Personalized Culinary Recommendations | Developed an AI chatbot using Python, Flask, NLP (including Named Entity Recognition), rule-based algorithms, and term-frequency distribution for personalized recipe suggestions. | Not reported numerically; the chatbot successfully retrieves tailored recipes from user inputs (cuisine, diet, course type) and a structured dataset. | The NLP plus rule-based approach enables dynamic, accurate recipe recommendations with ingredient lists and cooking steps. | Needs real-world user evaluations and comparison with other AI recommendation models. |
| 10 | Divya Tanwar, Tabashir Nobari, Pia Chaparro, Anand Panangadan | Quantitative Evaluation of AI-Generated Recipes for Health Recommender Systems | Used GPT-4 via the OpenAI API to generate recipes; compared them with Allrecipes and Mayo Clinic datasets using NER and cosine similarity; validated calories with the USDA FoodData Central API. | Ingredient match: 67–88% (avg 77.1%); directions: 64–86% (avg 74.3%); diabetic recipe match: 26–83%; calorie error (MAPE): 14% | GPT-4 can generate fairly accurate recipes; calorie estimation is relatively reliable; diabetic recipe ingredient match varies widely. | Limited sample size (14 recipes); equal weighting of ingredients; needs better prompt design and broader nutrient validation. |
The methodology of the ChefMoji system focuses on developing an AI-powered cooking assistant that integrates emotion recognition, voice interaction, and intelligent recipe generation. The system is designed using a modular approach where different technologies work together to provide personalized cooking recommendations to the user. The development process includes emotion detection, speech processing, AI-based recipe generation, and system integration.
[Figure: system flowchart. Starting from image capture or user upload, the Emotion Detection Module (local FER-based, with an OpenAI-based fallback) detects the user's mood; the Weather & Events Module fetches the weather and upcoming events (optionally via the Google Calendar API or a local events file); the Pantry Management Module maintains the Pantry Database and retrieves the current pantry state; personalized recipes are generated and the Recipe Ranking Module ranks them by pantry ingredients; the top recipes are then displayed with a Text-to-Speech (TTS) option that reads a recipe aloud and stops on voice command, alongside options to view recipe history and download a recipe as a PNG card.]

Figure 1: Conceptual Framework (Methodology)
ChefMoji is an AI-assisted cooking platform that uses emotion recognition technology, automatic recipe generation, and nutrition checks to provide users with a complete cooking experience.
The system builds on previous research in areas such as facial recognition of emotions [1][2], recipe generation using artificial intelligence [3][4], and chat-based cooking applications. As opposed to traditional recommendation systems, ChefMoji employs emotion-based computing, considers other factors (such as weather and time) when recommending recipes, and maintains a history of past pantry items to give users personalized cooking experiences.
ChefMoji uses a hybrid approach for real-time facial emotion recognition. The primary path is local FER-based identification; if the local result does not meet the requirements, the system falls back to the OpenAI multimodal API to classify the image provided by the user. These FER techniques build on established research, including surveys [1][7] and deep-learning models [2] used for emotion classification. Detected emotions are mapped to predefined mood groups: Happy, Sad, Angry, Neutral, Surprised, Fearful, and Nostalgic. Interactive and emotion-aware systems have been shown to significantly improve personalization in user interactions [8][9].
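The fallback logic described above can be sketched as follows. This is a minimal illustration, not ChefMoji's exact implementation: the detector callables, confidence threshold, and label mapping are assumptions made for the example.

```python
# Hypothetical sketch of the hybrid emotion-detection flow: a local FER
# pass is tried first, and the OpenAI multimodal API is used only as a
# fallback. The detector functions are illustrative placeholders.

MOOD_GROUPS = {
    "happy": "Happy", "sad": "Sad", "angry": "Angry",
    "neutral": "Neutral", "surprise": "Surprised",
    "fear": "Fearful", "nostalgic": "Nostalgic",
}

def to_mood_group(raw_label: str, default: str = "Neutral") -> str:
    """Map a raw classifier label onto one of the predefined mood groups."""
    return MOOD_GROUPS.get(raw_label.strip().lower(), default)

def detect_mood(image_bytes, local_detector, api_detector,
                confidence_threshold: float = 0.6) -> str:
    """Try the local FER model first; fall back to the API-based detector."""
    try:
        label, confidence = local_detector(image_bytes)
        if confidence >= confidence_threshold:
            return to_mood_group(label)
    except Exception:
        pass  # local model unavailable or failed: fall through to the API
    return to_mood_group(api_detector(image_bytes))
```

In practice, `local_detector` would wrap a FER library or a trained CNN, and `api_detector` would send the image to the multimodal API; both are stubbed here.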
ChefMoji increases the quality of recommendations by considering multiple contextual conditions: the time of day, the weather (e.g., via the OpenWeather API), and upcoming calendar events. Context-aware food recommendation systems have been shown to increase relevance when environmental factors are included [10]. The system detects upcoming events via Google Calendar integration or structured JSON inputs and adjusts its recommendations accordingly.
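These contextual signals might be assembled as in the following sketch. The weather and event shapes are hypothetical (a parsed OpenWeather-style response and a list of event titles), not the system's exact schema.

```python
from datetime import datetime
from typing import Optional

def time_of_day(hour: int) -> str:
    """Bucket the hour into a coarse, meal-relevant time of day."""
    if 5 <= hour < 11:
        return "morning"
    if 11 <= hour < 16:
        return "afternoon"
    if 16 <= hour < 21:
        return "evening"
    return "night"

def build_context(weather: dict, events: list,
                  now: Optional[datetime] = None) -> dict:
    """Assemble the contextual signals passed on to the recipe engine.

    `weather` is assumed to be a parsed OpenWeather-style response and
    `events` a list of upcoming calendar event titles; both shapes are
    illustrative assumptions.
    """
    now = now or datetime.now()
    return {
        "time_of_day": time_of_day(now.hour),
        "temperature_c": weather.get("temp"),
        "conditions": weather.get("conditions", "unknown"),
        "upcoming_events": events[:3],  # keep the downstream prompt short
    }
```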
ChefMoji uses an OpenAI large language model, accessed through the API, as its main recipe engine and receives structured recipe details in return. Each generated recipe object includes a title, cuisine, description, ingredients, directions, estimated calories, and a nutritional breakdown; some recipes also include a note and a pairing suggestion. Requesting this fixed structure makes the output easy to parse and display.
These recipes draw inspiration from both GPT-based culinary assistants [4][11] and newer AI-powered smart recipe generators [3]. The prompt engineering techniques follow best practices used in API-based generative systems [12].
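A prompt along these lines might be built as in the following sketch. The field list and wording are illustrative assumptions; the key idea is to request a fixed JSON schema so the response can be parsed reliably.

```python
import json

# Illustrative field list; the paper names these recipe attributes but
# not the exact JSON keys used by the system.
RECIPE_FIELDS = ["title", "cuisine", "description", "ingredients",
                 "directions", "estimated_calories", "nutrition",
                 "pairing_suggestion"]

def build_recipe_prompt(mood: str, context: dict, pantry: list) -> str:
    """Compose a structured-output prompt for the LLM recipe engine."""
    schema = ", ".join(RECIPE_FIELDS)
    return (
        f"You are a culinary assistant. The user currently feels {mood}. "
        f"Context: {json.dumps(context)}. "
        f"Available pantry items: {', '.join(pantry)}. "
        f"Suggest 3 recipes as a JSON array of objects with the keys: {schema}. "
        "Respond with JSON only."
    )
```

The returned string would be sent to the chat-completion endpoint; the fixed key list lets the client parse the response with an ordinary JSON parser.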
In cases where the API is unavailable, such as during an outage, ChefMoji falls back to a deterministic mood-based recipe generator so that the user experience is not disrupted.
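A minimal version of such a deterministic fallback might look like the sketch below; the placeholder recipes are invented for illustration and are not ChefMoji's actual fallback content.

```python
# Illustrative deterministic fallback: if the API call fails, serve a
# fixed mood-keyed recipe so the user flow is never interrupted.
FALLBACK_RECIPES = {
    "Happy": {"title": "Celebration Pasta",
              "ingredients": ["pasta", "tomato", "basil"]},
    "Sad": {"title": "Comfort Soup",
            "ingredients": ["broth", "noodles", "carrot"]},
    "Neutral": {"title": "Simple Stir-Fry",
                "ingredients": ["rice", "vegetables", "soy sauce"]},
}

def get_recipes(mood: str, api_call=None) -> list:
    """Try the API first; return the deterministic fallback on any failure."""
    if api_call is not None:
        try:
            return api_call(mood)
        except Exception:
            pass  # outage, timeout, or bad response: use the fallback
    return [FALLBACK_RECIPES.get(mood, FALLBACK_RECIPES["Neutral"])]
```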
A new pantry-based memory system has been added to ChefMoji using SQLite to store ingredients.
Recipes are scored with a matching algorithm based on two quantities: the count of matched pantry ingredients and the count of missing ingredients. Recipes are ordered by maximum pantry match and minimum number of missing ingredients. Ingredient-level recommendation strategies such as RecipeRec [13], which use graph and content approaches, inform the matching algorithm ChefMoji uses to rank recipes.
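The pantry store and the two-quantity ranking rule can be sketched with SQLite as follows; the table schema and function names are illustrative assumptions.

```python
import sqlite3

def init_pantry(conn: sqlite3.Connection) -> None:
    """Create the (illustrative) single-column pantry table."""
    conn.execute("CREATE TABLE IF NOT EXISTS pantry (ingredient TEXT PRIMARY KEY)")

def add_item(conn: sqlite3.Connection, ingredient: str) -> None:
    conn.execute("INSERT OR IGNORE INTO pantry VALUES (?)",
                 (ingredient.lower(),))

def pantry_items(conn: sqlite3.Connection) -> set:
    return {row[0] for row in conn.execute("SELECT ingredient FROM pantry")}

def rank_recipes(recipes: list, pantry: set) -> list:
    """Order recipes by most matched pantry ingredients, then fewest missing."""
    def score(recipe):
        ingredients = {i.lower() for i in recipe["ingredients"]}
        matched = len(ingredients & pantry)
        missing = len(ingredients - pantry)
        return (-matched, missing)  # sort ascending on this tuple
    return sorted(recipes, key=score)
```

Sorting on the `(-matched, missing)` tuple realizes the stated rule in one pass: higher match counts come first, and ties break toward recipes needing fewer purchases.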
Each recipe contains calorie metadata and nutrition fields. While AI-generated, nutritional logic follows the structured evaluation framework used in health recommender systems [10] and quantitative validation of AI-generated recipes [14]. Possible future extensions include the integration of nutrient validation APIs, similar to those used in studies on clinical nutrition recommender systems [10].
ChefMoji incorporates text-to-speech (TTS) functionality through pyttsx3, speech recognition to detect voice commands that stop playback, and predefined conversational personality templates aligned with the user's detected emotion. Chatbot-driven smart kitchen systems and conversational culinary assistants have been shown to increase engagement and accessibility [6][15].
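The mood-aligned personality templates might be selected as in this sketch. The wording is invented for illustration, and the actual pyttsx3 playback is omitted; the point is the template lookup keyed by detected mood.

```python
# Hypothetical mood-aligned conversational templates; the real ChefMoji
# wording is not specified in the paper.
PERSONALITY_TEMPLATES = {
    "Happy": "Love the energy! Here's something festive to match: {title}.",
    "Sad": "Let's keep it easy today. A gentle comfort dish: {title}.",
    "Neutral": "Here's a balanced pick for you: {title}.",
}

def greet(mood: str, recipe_title: str) -> str:
    """Pick the template for the detected mood, defaulting to Neutral."""
    template = PERSONALITY_TEMPLATES.get(mood, PERSONALITY_TEMPLATES["Neutral"])
    return template.format(title=recipe_title)
```

The resulting string would then be handed to the TTS engine (e.g., `pyttsx3`'s `say` and `runAndWait`) to be read aloud.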
The system persists recipes along with ingredient mappings, mood context, ratings, and weather and event metadata. Basic analytics modules report the most frequent mood and the most used ingredients. This data persistence can support user-adaptive learning, as seen in intelligent recommender systems [13].
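The two analytics queries named above reduce to simple frequency counts. This sketch assumes a hypothetical per-session history record with `mood` and `ingredients` fields; the stored schema may differ.

```python
from collections import Counter

def most_frequent_mood(history: list) -> str:
    """Return the most common mood across session records."""
    moods = Counter(record["mood"] for record in history)
    return moods.most_common(1)[0][0] if moods else "Neutral"

def top_ingredients(history: list, n: int = 3) -> list:
    """Return the n most frequently used ingredients across sessions."""
    counts = Counter(i for record in history for i in record["ingredients"])
    return [name for name, _ in counts.most_common(n)]
```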
The proposed ChefMoji system was assessed for its reliability in emotion detection, quality of AI-generated recipes, effectiveness of the pantry-based recommendation approach, and contextual-based personalization. The hybrid approach of the proposed Facial Emotion Recognition (FER) mechanism, where local and API-based detection were employed, showed promising performance under different lighting conditions and facial orientations. This is supported by the improved robustness of deep learning-based FER systems over traditional approaches, as highlighted in [2]. The incorporation of emotion mapping for culinary preferences showed promising results for mood-based recipes, as supported by previous research on the impact of affect-based systems on user engagement and personalization [8][9].
The output from the Large Language Model (LLM) based recipe generation module consistently followed a logical and structured pattern. The semantic consistency and logical structuring of the output can be attributed to the capabilities of AI-based culinary assistants reported in previous studies [3][4]. The novelty of the proposed system lies in incorporating the user's emotional state, weather conditions, and calendar events into the prompt structure, as opposed to traditional recipe generation systems that consider only ingredient constraints. Qualitative analysis of the output, inspired by AI-generated recipe system evaluation [14], shows high compatibility of the ingredients and a logical flow of the procedures.
The pantry-aware ranking mechanism greatly enhanced the practicality of the system by prioritizing the recipes that had higher ingredient matches and fewer ingredients to be purchased. This method is conceptually similar to other ingredient-based recommendation methods such as RecipeRec [13], but with the enhancements of persistence and adaptability.
Moreover, contextual integration with respect to the weather and event detection also enhanced the intelligence of the recommendations. For example, the cold weather conditions led to the recommendation of comfort food for meals, while festive events on the calendar led to the recommendation of celebratory food. Such multi-factor adaptation is also similar to the findings of the health-aware and context-driven recommendation systems [10]. The multi-factor integration of the recommendations, as proposed by the hybrid model, also outperforms the recommendations proposed by the individual AI-based culinary assistants [3][6].
The overall findings of the experiments demonstrate the effectiveness of the proposed hybrid model as a solution for integrating emotion awareness with AI-based culinary intelligence.
ChefMoji is a new type of AI-based culinary assistant that combines multiple technologies, including emotion recognition from facial expressions, to generate recipes, make pantry-aware recommendations based on available ingredients, and personalize the culinary experience through contextual signals. Unlike systems that rely on basic recommendation methods such as static filtering or nearest-neighbor matching, ChefMoji uses affective computing to deepen personalization.
The system builds on well-established (but still evolving) emotion recognition methods [1][2] and adaptive emotion-aware systems [8][9], continuously aligning recipe suggestions with the user's detected mood. By using established structured recipe generation models [3][4], the assistant extends previous AI-based culinary assistants with output that is both context-sensitive and conversational. In addition, the pantry-aware ranking method reduces the frustration of missing ingredients by drawing on ingredient-level recommendation strategies such as RecipeRec [13].
Applying contextual, situational attributes such as weather and calendar or seasonal events plays a major role in increasing the value of personalized recommendations, providing the foundational components of health-aware and context-aware recommendation systems as described in [10]. While the nutritional data generated by the AI follows a consistent structure, future clinical validation, analogous to the quantitative evaluation framework described in [14], will increase the precision of this aspect of the work.
In summary, the integration of affective intelligence, generative AI, contextual adjustment, and persistent memory mechanisms in ChefMoji demonstrates how these four technologies together significantly improve the functionality of, and engagement with, intelligent culinary systems. The proposed model contributes a holistic, emotion-centred, and context-based model of a culinary assistant that can advance the development of smart kitchen technologies.
Even though ChefMoji is able to effectively combine emotion detection, generative AI, and context-based personalization, several changes can still be made to improve the system.
One area that can be explored in the future is the quantitative benchmarking of the accuracy of the emotion detection system, as well as the quality of the recipes generated. Advanced deep learning-based facial emotion detection models can also be incorporated into the system to improve the accuracy of the system under adverse conditions such as low lighting, occlusion, and varying facial orientations.
The nutritional value of the generated recipes can also be improved by incorporating clinical nutrition information. The accuracy of the generated recipes can be evaluated scientifically using metrics such as cosine similarity and Mean Absolute Percentage Error (MAPE).
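The two proposed metrics are straightforward to compute. The sketch below evaluates ingredient overlap as cosine similarity over bag-of-ingredient count vectors and calorie error as MAPE; the vector representation is an assumption made for illustration.

```python
import math

def cosine_similarity(a: dict, b: dict) -> float:
    """Cosine similarity between two bag-of-ingredient count vectors,
    each given as {ingredient: count}."""
    common = set(a) & set(b)
    dot = sum(a[k] * b[k] for k in common)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def mape(actual: list, predicted: list) -> float:
    """Mean Absolute Percentage Error, e.g. for calorie estimates
    against a reference database."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)
```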
In the future, multimodal AI features can also be incorporated, such as the identification of the ingredients through their images, which can then be uploaded for the generation of recipes.
Furthermore, long-term user interaction can also be utilized for the enhancement of the quality of recommendations through the use of reinforcement learning and graph-based recommendation systems.
Lastly, the integration of IoT-enabled smart kitchen appliances can also transform ChefMoji into an automated smart kitchen assistant.
References