
nerual-network

Here are 224 public repositories matching this topic...

Tokenization is a way of separating a piece of text into smaller units called tokens. Tokens can be words, characters, or subwords. Hence, tokenization can be broadly classified into three types: word, character, and subword (character n-gram) tokenization.

  • Updated Jun 30, 2021
  • Jupyter Notebook
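The three tokenization types in the description above can be sketched in plain Python; this is a minimal illustration using only the standard library, not the repository's own code, and the `char_ngrams` helper is a hypothetical name for this sketch.

```python
# Minimal sketch of word, character, and subword (character n-gram)
# tokenization, as described in the repository summary above.
text = "tokenization splits text"

# Word tokenization: split on whitespace.
word_tokens = text.split()

# Character tokenization: every character becomes a token.
char_tokens = list(text)

# Subword tokenization: sliding character n-grams within each word.
def char_ngrams(word, n=3):
    """Return the character n-grams of a single word."""
    if len(word) < n:
        return [word]
    return [word[i:i + n] for i in range(len(word) - n + 1)]

subword_tokens = [g for w in word_tokens for g in char_ngrams(w)]

print(word_tokens)       # ['tokenization', 'splits', 'text']
print(char_tokens[:5])   # ['t', 'o', 'k', 'e', 'n']
print(subword_tokens[:4])  # ['tok', 'oke', 'ken', 'eni']
```

In practice, subword tokenizers such as BPE or WordPiece learn their vocabulary from data rather than using fixed-length n-grams, but the fixed-window version shown here conveys the idea.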

Autonomous vehicles offer a promising path to safer driving by monitoring and utilizing real-time sensing data. Information collected from different sensors, including LiDARs, ultrasonic sensors, cameras, radars, and GPS, can be processed and analyzed to operate the vehicle. Computer vision has played an essential role in advancing this t…

  • Updated May 10, 2022
  • Jupyter Notebook
