
✋ Hello  World

  Hi there! I'm moon, someone dedicated to tackling challenges through artificial intelligence. Among the various subfields of AI, Natural Language Processing intrigues me the most. By profession, I'm an NLP Machine Learning Engineer. My goal as an engineer is to create models that can engage in natural communication with people. The code in all my repositories reflects my progress toward that objective. In addition to the code in my Git repos, I've provided comprehensive project summaries on my portfolio website. I encourage you to take a moment to explore and discover more about each project. If you'd like to get in touch, feel free to reach out via the email address listed in the profile information on the left.



🤖  Model  Architecture

  The model architecture plays a pivotal role in machine learning engineering, as it can greatly influence performance. Below, you'll find a collection of projects focused on exploring and establishing standards for appropriate model structures in three NLG tasks: Translation, Dialogue Generation, and Summarization.

•  Transformer Balance       •  Transformer Variants       •  Transformer Fusion
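
  As a rough illustration of the kind of architecture these projects compare, here is a minimal encoder-decoder Transformer sketch in PyTorch. The hyperparameters, vocabulary size, and omitted positional encodings are simplifying assumptions for illustration, not settings taken from the repositories.

```python
import torch
import torch.nn as nn

class MiniTransformer(nn.Module):
    """A bare-bones encoder-decoder Transformer for seq2seq NLG tasks.
    Positional encodings are omitted for brevity; all sizes are placeholders."""

    def __init__(self, vocab_size=32000, d_model=256, nhead=8, num_layers=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True,
        )
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Causal mask so each target position attends only to earlier positions.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        hidden = self.transformer(self.embed(src_ids), self.embed(tgt_ids),
                                  tgt_mask=tgt_mask)
        return self.lm_head(hidden)  # (batch, tgt_len, vocab_size) logits

model = MiniTransformer()
logits = model(torch.randint(0, 32000, (2, 10)), torch.randint(0, 32000, (2, 7)))
print(logits.shape)  # torch.Size([2, 7, 32000])
```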



🏃‍♂️  Training  Strategy

  Alongside model architecture, the training strategy is another crucial factor in a deep learning model's performance. The projects below explore more advanced training methodologies, applying and developing approaches such as pretraining, fine-tuning, and GAN-based training, along with studies aimed at improving training efficiency.

•  Customized  Pretraining       •  Generation  Improving  Fine-Tuning       •  IntelliGEN
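
  To make the pretrain-then-finetune idea concrete, below is a minimal sketch of a generic fine-tuning loop in PyTorch. The seq2seq model interface and the padding id are assumptions for illustration; this is not the training code from these repositories.

```python
import torch
import torch.nn as nn

def finetune(model: nn.Module, loader, epochs=3, lr=5e-5, pad_id=0):
    """Generic fine-tuning: start from pretrained weights and keep training
    on task-specific (src, tgt) batches with a small learning rate."""
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss(ignore_index=pad_id)  # pad_id is assumed
    model.train()
    for _ in range(epochs):
        for src_ids, tgt_ids in loader:
            # Teacher forcing: predict tgt[1:] from tgt[:-1].
            logits = model(src_ids, tgt_ids[:, :-1])
            loss = criterion(logits.reshape(-1, logits.size(-1)),
                             tgt_ids[:, 1:].reshape(-1))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```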



🎯  Task Specific  Experiment

  Machine translation is the task of converting text from a source language into a target language by computational means. The field was dominated first by rule-based systems, then by statistical machine translation (SMT), and today neural machine translation (NMT) is the established paradigm. NMT uses neural networks to produce more accurate and natural translations. Below are task-specific experiments with various neural network architectures toward this goal.

•  Multi-Lingual Translation       •  Multi-Turn Dialogue Generation       •  Efficient Text Summarization
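
  The inference step common to all three tasks is autoregressive decoding. Below is a minimal greedy-decoding sketch, assuming a seq2seq model with the interface of the MiniTransformer above and placeholder BOS/EOS token ids.

```python
import torch

@torch.no_grad()
def greedy_decode(model, src_ids, bos_id=1, eos_id=2, max_len=64):
    """Greedy decoding: extend the target one token at a time,
    always picking the highest-probability next token."""
    model.eval()
    tgt = torch.full((src_ids.size(0), 1), bos_id, dtype=torch.long)
    for _ in range(max_len):
        logits = model(src_ids, tgt)                  # (batch, len, vocab)
        next_token = logits[:, -1].argmax(-1, keepdim=True)
        tgt = torch.cat([tgt, next_token], dim=1)
        if (next_token == eos_id).all():              # all sequences finished
            break
    return tgt
```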



🏗️  LLM  Framework

  Large Language Models (LLMs) are currently achieving remarkable results across a wide range of fields. To fully leverage their performance, however, it is crucial to adopt methodologies tailored to user needs. A base LLM already yields significant results, but better outcomes can be achieved through modest improvements and additional techniques. The following projects explore methodologies for effectively leveraging LLMs according to user needs and propose practical frameworks applicable to real-world services.

•  Context-Aware Translation Framework     •  Characteristic Conv Framework     •  Trustworthy Conv Framework
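
  As a rough sketch of what "context-aware" LLM usage can look like, the snippet below prepends prior conversation turns to the prompt before generating. It uses the Hugging Face transformers text-generation pipeline; the model choice and prompt format are illustrative assumptions, not the actual design of these frameworks.

```python
from transformers import pipeline

# Illustrative model choice; the frameworks' actual models are not specified here.
generator = pipeline("text-generation", model="gpt2")

def context_aware_generate(history, user_input, max_new_tokens=40):
    """Prepend prior turns so the model conditions on conversational context."""
    prompt = "\n".join(history + [f"User: {user_input}", "Assistant:"])
    out = generator(prompt, max_new_tokens=max_new_tokens, do_sample=False)
    # The pipeline returns prompt + continuation; keep only the continuation.
    return out[0]["generated_text"][len(prompt):].strip()

history = ["User: Translate 'bank' into Korean.",
           "Assistant: It depends on the context of the sentence."]
print(context_aware_generate(history, "I mean the bank of a river."))
```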



💾  Dataset

  High-quality data is just as decisive for model performance as architecture and training strategy. The projects below cover the data side of NLP: curated datasets, tokenizer construction, and back-translation-based augmentation.

•  NLP Datasets         •  Tokenizers         •  Back Translation         •  SemEnt
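
  Back translation, listed above, augments parallel training data by translating monolingual target-side text back into the source language with a reverse-direction model. Below is a minimal sketch using a public Hugging Face translation pipeline; the checkpoint and language pair are illustrative assumptions.

```python
from transformers import pipeline

# Illustrative public checkpoint; any reverse-direction model works the same way.
de_to_en = pipeline("translation_de_to_en", model="Helsinki-NLP/opus-mt-de-en")

def back_translate(target_sentences):
    """Given monolingual target-language (German) text, produce synthetic
    source-language (English) sentences to form extra parallel pairs."""
    outputs = de_to_en(target_sentences)
    return [(out["translation_text"], tgt)    # (synthetic source, real target)
            for out, tgt in zip(outputs, target_sentences)]

print(back_translate(["Maschinelle Übersetzung lernt aus Daten."]))
```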


Pinned

  1. Transformer_Variants (Python): Transformer Architectures Comparison in Natural Language Generation Tasks

  2. Transformer_Fusion (Python): Methodologies for utilizing a pre-trained BERT model on the NMT task

  3. GIFT (Python)

  4. LEFT (Python)

  5. Context_Framework (Python)

  6. Character_Framework