ML Research Hub
Advancing research in Machine Learning – practical insights, tools, and techniques for researchers.

Admin: @HusseinSheikho || @Hussein_Sheikho
🐈 TTT Long Video Generation 🐈

👉 A novel architecture for video generation, adapting the #CogVideoX 5B model by incorporating #TestTimeTraining (TTT) layers.
Adding TTT layers to a pre-trained Transformer enables it to generate one-minute clips from text storyboards.
Videos, code & annotations released 💙

🔗 Review: https://t.ly/mhlTN
📄 Paper: arxiv.org/pdf/2504.05298
🌐 Project: test-time-training.github.io/video-dit
💻 Repo: github.com/test-time-training/ttt-video-dit

#AI #VideoGeneration #MachineLearning #DeepLearning #Transformers #TTT #GenerativeAI
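💡 The core trick above: a TTT layer's hidden state is itself a tiny model, updated by gradient descent on a self-supervised loss as the sequence streams in. A minimal NumPy sketch of that idea (a hypothetical simplification for intuition, not the paper's CogVideoX integration):

```python
import numpy as np

def ttt_layer(tokens, dim, lr=0.1):
    """Minimal sketch of a Test-Time Training (TTT) layer: the layer's
    hidden state is the weight matrix W of a small inner model, updated
    by one gradient step per token on a self-supervised reconstruction
    loss. (Toy simplification, not the paper's implementation.)"""
    W = np.zeros((dim, dim))        # inner model's weights start at zero
    outputs = []
    for x in tokens:                # process the sequence causally
        # self-supervised inner loss: 0.5 * ||W x - x||^2
        err = W @ x - x
        grad = np.outer(err, x)     # gradient of the inner loss w.r.t. W
        W -= lr * grad              # test-time update: W learns online
        outputs.append(W @ x)       # output uses the updated state
    return np.stack(outputs)
```

Because the state is a fixed-size weight matrix rather than a growing cache, per-token cost stays constant however long the sequence gets — which is what makes minute-long generation tractable.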

End-to-End Test-Time Training for Long Context

📝 Summary:
This paper proposes End-to-End Test-Time Training (TTT-E2E) for long-context language modeling, framing it as continual learning. It uses a standard Transformer that learns at test time, with its initialization improved via meta-learning. TTT-E2E scales well and offers constant inference latency, being m...
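💡 The idea can be sketched in NumPy (my own toy illustration of "context as continual learning", not the paper's architecture): consume the context chunk by chunk with test-time gradient steps on a next-token loss, so the memory of the context lives in fixed-size weights rather than a KV cache that grows with length.

```python
import numpy as np

def absorb_context(context_ids, vocab, lr=0.1, chunk=16):
    """Toy illustration of the TTT-E2E idea: take SGD steps on a
    next-token loss over context chunks at test time, so fixed-size
    weights absorb the context. (A deliberate simplification.)"""
    E = np.eye(vocab)                 # toy one-hot token embeddings (frozen)
    W = np.zeros((vocab, vocab))      # readout weights, updated at test time
    for s in range(0, len(context_ids) - 1, chunk):
        xs = context_ids[s:s + chunk]
        ys = context_ids[s + 1:s + chunk + 1]
        for x, y in zip(xs, ys):      # one SGD step per context token
            logits = E[x] @ W
            p = np.exp(logits - logits.max())
            p /= p.sum()
            p[y] -= 1.0               # gradient of softmax cross-entropy
            W -= lr * np.outer(E[x], p)  # weights absorb the context
    return E, W

def next_token_probs(E, W, last_id):
    # Inference touches only the fixed-size state (E, W), so per-token
    # latency is constant regardless of how long the absorbed context was.
    logits = E[last_id] @ W
    p = np.exp(logits - logits.max())
    return p / p.sum()
```

After absorbing a repetitive context, the adapted weights predict its pattern; the point is that state size and per-token inference cost do not grow with context length, which is where the constant-latency claim comes from.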

🔹 Publication Date: Dec 29, 2025

🔹 Paper Links:
• arXiv Page: https://arxivlens.com/PaperView/Details/end-to-end-test-time-training-for-long-context-6176-bf8fd7e6
• PDF: https://arxiv.org/pdf/2512.23675
• Github: https://github.com/test-time-training/e2e

==================================

For more data science resources:
https://t.iss.one/DataScienceT

#TestTimeTraining #LongContext #LanguageModels #Transformers #ContinualLearning