πŸ–₯ Data structures in Python - cheat sheet

A powerful cheat sheet on Python data structures. Everything is explained with examples, so it will be crystal clear: mutability and immutability, list comprehensions, and much more.
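For a quick taste of the topics covered, here is a minimal illustration of mutability vs. immutability and a list comprehension (the toy values are ours, not from the cheat sheet itself):

# Lists are mutable: they can be changed in place.
nums = [1, 2, 3]
nums.append(4)        # nums is now [1, 2, 3, 4]

# Tuples are immutable: modifying one raises a TypeError.
point = (1, 2)
# point[0] = 5        # TypeError: 'tuple' object does not support item assignment

# A list comprehension builds a new list in a single expression.
squares = [n * n for n in nums if n % 2 == 0]   # [4, 16]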

πŸ“Ž Cheat sheet

βœ…οΈ http://www.tgoop.com/codeprogrammer βœ…οΈ
⚑ Big ML cheat sheet

Here you will find the core theory of machine learning along with implementations of specific ML algorithms. In short, it is just the thing to brush up on your knowledge before an interview.
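As a flavor of the kind of algorithm implementations you will find inside, here is a sketch of k-nearest neighbors in NumPy (the function and the toy data are illustrative, not taken from the cheat sheet):

import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    # Euclidean distance from x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Take the labels of the k closest points and do a majority vote
    nearest = y_train[np.argsort(dists)[:k]]
    return np.bincount(nearest).argmax()

X_train = np.array([[0, 0], [0, 1], [5, 5], [6, 5]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([5, 6])))  # prints 1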

πŸ“Ž Cheat sheet

βœ…οΈ http://www.tgoop.com/codeprogrammer βœ…οΈ
⚑ Neural Networks: Zero to Hero - a series of 8 lectures and practical exercises from Andrej Karpathy

This is a course on neural networks from the very basics, perhaps the best on the entire Internet.
It is a series of YouTube videos in which Karpathy shows how to design and train neural networks.
All the material comes as Jupyter notebooks that you can download and experiment with.
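The kind of code the lectures build up to looks roughly like this minimal PyTorch training loop (a sketch of ours, not an excerpt from the course notebooks):

import torch
import torch.nn as nn

# A tiny MLP classifier on toy data
model = nn.Sequential(nn.Linear(2, 8), nn.Tanh(), nn.Linear(8, 1))
opt = torch.optim.SGD(model.parameters(), lr=0.1)

X = torch.randn(64, 2)
y = (X.sum(dim=1, keepdim=True) > 0).float()   # toy labels

for step in range(100):
    loss = nn.functional.binary_cross_entropy_with_logits(model(X), y)
    opt.zero_grad()
    loss.backward()   # backpropagation, derived from scratch in the first lecture
    opt.step()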

▢️ Neural Networks: Zero to Hero

βœ…οΈ http://www.tgoop.com/codeprogrammer βœ…οΈ
🌟 Megatron-Core - PyTorch library for training Transformers

# Pull NVIDIA's PyTorch container and start an interactive session
docker run --ipc=host --shm-size=512m --gpus all -it nvcr.io/nvidia/pytorch:24.02-py3

# Inside the container: install Megatron-Core plus the packages
# its distributed checkpointing relies on
pip install megatron_core
pip install tensorstore==0.1.45
pip install zarr


Megatron-Core is a self-contained, lightweight PyTorch library that contains everything you need to train Transformers.
It offers a large collection of GPU techniques for optimizing memory and compute, and it reuses many developments from Megatron-LM and Transformer Engine.

Megatron-Core gives developers the flexibility to build their own LLM framework on top of NVIDIA computing infrastructure.
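A minimal sketch of getting started, assuming a multi-GPU job launched with torchrun (the parallel sizes below are illustrative and must divide the number of ranks):

import torch
from megatron.core import parallel_state

# Megatron-Core sits on top of torch.distributed, so a process
# group must exist before model parallelism is configured.
torch.distributed.init_process_group(backend="nccl")

# Carve the job's GPUs into tensor-parallel groups of 2.
parallel_state.initialize_model_parallel(
    tensor_model_parallel_size=2,
    pipeline_model_parallel_size=1,
)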

πŸ–₯ GitHub
🟑 Docs

βœ…οΈ http://www.tgoop.com/codeprogrammer βœ…οΈ
🧰 A comprehensive toolbox for data scientists
βœ… Access to 500+ TB of data science content

πŸ–₯ The Data Scientist Toolkit is the result of my five years of work in data science: an extremely comprehensive and extensive resource for anyone who wants to become a professional data scientist. Don't miss it.

βš™οΈ This toolbox includes the following sections:

πŸ“š Access to a wide bank of scientific materials, training courses, and professional resources in data science.

πŸ’― More than a decade of applied data science theses in finance, medicine, logistics, and security.

πŸ› The latest data science courses from leading universities in the world such as Stanford, MIT and Berkeley.

πŸš€ Over 300 terabytes of data science courses for experienced data scientists.

β”Œ
🐱 GitHub Repository
β””
⏩ The Data Scientist's Toolbox

🌐 http://www.tgoop.com/codeprogrammer βœ…
πŸ“„ The best free courses to learn data science


β”Œ
🏷 CS229: Machine Learning
β””
βœ… LINK

β”Œ
🏷 MIT: Linear Algebra
β””
βœ… LINK

β”Œ
🏷 MIT: Introduction to Algorithms
β””
βœ… LINK

β”Œ
🏷 MIT: Applied Probability
β””
βœ… LINK

β”Œ
🏷 Stanford: Relational Databases & SQL
β””
βœ… LINK

🌐 http://www.tgoop.com/codeprogrammer βœ…
A collection of completely free neural networks on Hugging Face

You can use them for cool photo upscaling, background removal, image editing, and more.

All the models are free and open source on Hugging Face.
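For example, here is a sketch of upscaling a photo with the transformers image-to-image pipeline, assuming one of the public Swin2SR checkpoints (the model id and file paths are illustrative; any similar model works the same way):

from transformers import pipeline
from PIL import Image

# Load a 2x super-resolution model from the Hugging Face Hub
upscaler = pipeline("image-to-image", model="caidas/swin2SR-classical-sr-x2-64")

image = Image.open("photo.jpg")   # any local image
upscaled = upscaler(image)        # returns a 2x-upscaled PIL image
upscaled.save("photo_2x.jpg")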

🌐 http://www.tgoop.com/codeprogrammer βœ…