
Neural Networks Engineering

Logo of telegram channel neural_network_engineering — Neural Networks Engineering
Channel address: @neural_network_engineering
Categories: Technologies
Language: English
Subscribers: 2.35K
Description from channel

Author's channel about neural network development and machine learning. Experiments, tool reviews, personal research.
#deep_learning
#NLP
Author @generall93

Ratings & Reviews

3.00

2 reviews

Reviews can be left only by registered users. All reviews are moderated by admins.

5 stars: 1
4 stars: 0
3 stars: 0
2 stars: 0
1 star: 1


Latest messages

2022-06-29 14:46:24
One of the main features of the framework is caching. It lets you run inference with a large model only once and then reuse the cached vectors during training. This speeds up the process roughly 100x while also allowing batch sizes that are unattainable in other ways.

(gif)
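The caching idea can be sketched in plain Python: the expensive encoder runs once per sample, and later epochs read from the cache. Here `encode` is a toy stand-in for a frozen pre-trained model, not the framework's actual API:

```python
# Minimal sketch of embedding caching: the encoder forward pass happens
# once per distinct sample; repeated samples and later epochs hit the cache.
calls = {"count": 0}

def encode(sample: str) -> list[float]:
    calls["count"] += 1  # pretend this is an expensive forward pass
    return [float(len(sample)), float(sample.count("a"))]

cache: dict[str, list[float]] = {}

def cached_encode(sample: str) -> list[float]:
    if sample not in cache:
        cache[sample] = encode(sample)
    return cache[sample]

dataset = ["banana", "apple", "banana", "apple", "banana"]
vectors = [cached_encode(s) for s in dataset]  # 5 lookups, only 2 real calls
```

In a real pipeline the cache would be keyed by sample id and persisted to disk, but the trade-off is the same: one pass of inference buys arbitrarily many training epochs over the cached vectors.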
1.3K views · Andrey, edited 11:46
2022-06-29 14:45:36 Similarity Learning lacks a framework. So we built one.

Many general-purpose frameworks let you train models for Computer Vision or NLP tasks quickly. However, Similarity Learning has peculiarities that usually require an additional layer of complexity on top of the usual pipelines.
For example, batch size plays a much greater role in training similarity models than in other models. Labels either do not exist or are handled in a completely different way. In many cases, the model is already pre-trained, which also changes the process.

Developing Similarity Learning models one after another, we began to notice patterns that helped us generalize and bring all our experience with training and fine-tuning such models into one package.
Yesterday we published Quaterion — an open-source, blazing-fast, customizable, scalable framework for training and fine-tuning similarity learning models.
1.3K views · Andrey, 11:45
2022-05-04 15:40:30 Metric Learning for Anomaly Detection

Anomaly detection is one of those tasks to which it is hard to apply classical ML methods directly.
The imbalance between normal and abnormal examples and the internal inconsistency of anomalies make classifier training challenging.

And the difficulty is often related to data labeling, which in the case of anomalies may not be trivial.

The metric learning approach avoids the explicit separation into classes while combining the advantage of modeling the subject domain with the knowledge of specific anomalous examples.

In our case study, we are solving the problem of estimating the quality of coffee beans and determining the type of defects.

We trained an auto-encoder on unlabeled samples and then fine-tuned it on a small fraction of labeled ones.
This approach achieves results equivalent to conventional classification but requires orders of magnitude less labeled data.
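As a rough illustration of anomaly scoring in an embedding space (not the case-study code — there, a real encoder produces the vectors), a new sample can be scored by its distance to the centroid of known-normal embeddings:

```python
import math

# Toy sketch: take the centroid of known-normal embedding vectors and
# score new samples by their distance to it. Larger distance = more anomalous.
def centroid(vectors):
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def anomaly_score(vector, normal_centroid):
    return math.dist(vector, normal_centroid)

normal = [[0.9, 1.0], [1.1, 1.0], [1.0, 0.9]]   # embeddings of normal beans
c = centroid(normal)
ok_score = anomaly_score([1.0, 1.0], c)    # close to the normal cluster
weird_score = anomaly_score([5.0, 5.0], c) # far from it
```

A fine-tuned metric learning model shapes the embedding space so that this kind of simple geometric scoring actually separates normal samples from defects.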
386 views · nne_controll_bot, 12:40
2022-03-25 15:24:04 Triplet loss - Advanced Intro

Loss functions in metric learning all chase the same goal: pull positive pairs closer together and push negatives further apart.
But the way they achieve this leads to different results and different side effects.

In today's post, we describe the differences between Triplet and Contrastive loss, and why Triplet loss can give an advantage, especially in the context of fine-tuning.
The post also covers an efficient implementation of batch-all triplet mining.
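The logic of batch-all mining can be sketched in plain Python: enumerate every valid (anchor, positive, negative) triplet in a batch and average the hinge losses. Real frameworks vectorize this over a pairwise distance matrix, but the computation is the same:

```python
import math

def batch_all_triplet_loss(embeddings, labels, margin=1.0):
    """Average triplet loss over all valid (anchor, positive, negative)
    triplets in the batch: max(d(a, p) - d(a, n) + margin, 0)."""
    losses = []
    n = len(embeddings)
    for a in range(n):
        for p in range(n):
            if p == a or labels[p] != labels[a]:
                continue  # positive must share the anchor's label
            for neg in range(n):
                if labels[neg] == labels[a]:
                    continue  # negative must have a different label
                loss = (math.dist(embeddings[a], embeddings[p])
                        - math.dist(embeddings[a], embeddings[neg])
                        + margin)
                losses.append(max(loss, 0.0))
    return sum(losses) / len(losses) if losses else 0.0

# Well-separated classes: every triplet already satisfies the margin.
easy = batch_all_triplet_loss(
    [[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]], [0, 0, 1, 1])
# Interleaved classes: many triplets violate the margin.
hard = batch_all_triplet_loss(
    [[0.0, 0.0], [1.0, 1.0], [0.1, 0.1], [0.9, 0.9]], [0, 0, 1, 1])
```

The "batch-all" part is exactly the triple loop: instead of sampling one triplet per anchor, every valid combination in the batch contributes to the loss.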
861 views · nne_controll_bot, 12:24
2022-03-01 10:00:29 Against Putin

Hi everyone, today's post is not about machine learning.

Everyone may already know that Putin unleashed the war in Ukraine.
Thousands of people are dying under his tanks because of the maniacal ambitions of a madman.

It is tough to resist a tyrant; Russians did not manage it, but I want to believe that Ukrainians will.
Two years ago, among thousands of others, I was arrested and prosecuted for a peaceful protest. Unfortunately, it changed nothing.
No one has to be a hero, and I don't hold Russian people responsible for what the regime has done to our countries.

But I want to urge the Russian IT community to do at least what little is left in your power to stop this nightmare.
Try to find a way to avoid helping the occupation government.
Move while you still can. Quit working for government companies or projects. Don't help to build surveillance, censorship, and propaganda. Avoid paying taxes to the murderers.

It will be worse if nobody stops it.
568 views · nne_controll_bot, edited 07:00
2022-01-20 13:45:09 Awesome Metric Learning
The Metric Learning approach to data science problems is heavily underutilized. There is a lot of academic research around it, but far fewer practical guides and tutorials.
So we decided that we could help people adopt metric learning by collecting related materials in one place.
We are publishing a curated list of awesome practical metric learning tools, libraries, and materials - https://github.com/qdrant/awesome-metric-learning!
This collection aims to put together references to all relevant materials for building your application using Metric Learning.
If you know an exciting article, a helpful tool, or a blog post that helped you apply metric learning - feel free to open a PR with your proposal!
390 views · nne_controll_bot, 10:45
2021-12-22 16:17:24 Hi!
Check out the first-hand experience of building neural search solutions with Qdrant from one of its users in the latest Vector Podcast.
488 views · nne_controll_bot, 13:17
2021-09-15 14:08:39 ODS.ai Summer of Code results

Hi everyone, ODS SoC officially finished last week, and it is time to present the results.
First of all, the winner of the Metric Learning track Tatiana Grechishcheva has published a detailed article on her work of fine-tuning and deploying metric learning models.
She fine-tuned a ViT model for matching similar clothing and put together a detailed tutorial on how to deploy such a model to production.
An online demo is also included!

There are also some exciting results on fine-tuning transformers with different types of head layers.
In a nutshell: a couple of hundred examples are enough to improve similarity matching results without overfitting.
I will make a separate post about it and further plan on making metric learning practical.
435 views · nne_controll_bot, 11:08
2021-07-31 12:16:42 Hello everyone,

The largest Russian-speaking data science community, ODS.ai, is organizing a Summer School modeled after the famous Google SoC, and I am participating as a mentor for the Metric Learning track with Qdrant.

During the track, participants are challenged to research fine-tuning for similarity learning, build a working prototype, and contribute to Open Source.

Today at 18:00 MSK, the first meetup of the track will take place. I will be talking about:
- Which datasets are suitable for similarity matching, and how to obtain them in a self-supervised way
- Approaches to fine-tuning encoders, and how to choose a method depending on the amount of available annotation


I invite everyone interested to Spatial Chat ODS, at 18:00 MSK. Password: odssummerofcodeison.

Language of the event: Russian.

Materials will be available in English later on the channel.
546 views · nne_controll_bot, 09:16
2021-06-10 11:58:05 Neural Search Step-by-Step

We made a tutorial on Semantic Embeddings and Neural Search. With this guide, you will build your own semantic search service from scratch.

You won't need any complicated training of the neural network. Moreover, you can do all preparation steps in the Google Colab notebook.

The tutorial covers:
- What is Neural Search?
- Getting embeddings from a BERT encoder
- Using the Qdrant vector search engine
- Creating an API server with FastAPI
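At its core, the search step of such a service is just nearest-neighbor ranking over embedding vectors. In the tutorial the vectors come from a BERT-like encoder and the search runs inside Qdrant; here both are replaced with toy 2-d vectors to show the ranking logic:

```python
import math

# Sketch of the ranking behind a semantic search service: score every
# indexed document by cosine similarity to the query vector, return the top-k.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(query_vec, index, top_k=2):
    """index: list of (payload, vector); returns payloads by similarity."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [payload for payload, _ in ranked[:top_k]]

index = [("doc_a", [1.0, 0.0]), ("doc_b", [0.0, 1.0]), ("doc_c", [0.9, 0.1])]
results = search([1.0, 0.0], index)  # doc_a first, then doc_c
```

A vector search engine like Qdrant does exactly this, but with approximate indexes that stay fast at millions of vectors instead of a linear scan.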


If you want to learn how to build projects like this, the tutorial is for you.
437 views · nne_controll_bot, 08:58