Deep Gravity
Semantic Segmentation of Thigh Muscle using 2.5D #DeepLearning Network Trained with Limited Datasets

Purpose: We propose a 2.5D #DeepLearning #NeuralNetwork (#DLNN) to automatically classify thigh muscle into 11 classes and evaluate its classification accuracy against 2D and 3D DLNNs when trained with limited datasets. This enables operator-invariant quantitative assessment of thigh muscle volume change with disease progression.

Materials and methods: The retrospective dataset consists of 48 thigh volumes (TVs) cropped from CT DICOM images. Cropped volumes were aligned with the femur axis and resampled at 2 mm voxel spacing. The proposed 2.5D DLNN consists of three 2D U-Nets trained on axial, coronal, and sagittal muscle slices, respectively. A voting algorithm combines the U-Net outputs into the final segmentation. The 2.5D U-Net was trained on a PC with 38 TVs, and the remaining 10 TVs were used to evaluate segmentation accuracy over 10 classes within the thigh. The resulting segmentations of both left and right thighs were de-cropped back to the original CT volume space. Finally, segmentation accuracies were compared between the proposed DLNN and 2D/3D U-Nets.

Results: The average DSC score over all classes was 91.18 for the 2.5D U-Net, exceeding the 2D U-Net by 3.3 and the 3D U-Net by 5.7 DSC points on the same datasets.

Conclusion: We achieved a fast, computationally efficient, automatic segmentation of thigh muscle into 11 classes with reasonable accuracy, enabling quantitative evaluation of muscle atrophy with disease progression.
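The voting step described above can be sketched as a per-voxel majority vote over the three orthogonal U-Net label volumes. This is an illustrative reconstruction, not the authors' code; the function name, shapes, and tie-breaking rule are assumptions.

```python
import numpy as np

def majority_vote(axial, coronal, sagittal, n_classes=11):
    """Combine three label volumes of identical shape by per-voxel voting."""
    stacked = np.stack([axial, coronal, sagittal])  # shape (3, D, H, W)
    # Count, for every voxel, how many of the three views predicted each class.
    votes = np.zeros((n_classes,) + axial.shape, dtype=np.int32)
    for c in range(n_classes):
        votes[c] = (stacked == c).sum(axis=0)
    # argmax picks the majority class; ties fall back to the lowest class index.
    return votes.argmax(axis=0)
```

Where all three views agree the vote is unanimous; where only two agree, the outlier view is overruled, which is the usual motivation for 2.5D ensembles over a single 2D network.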

Link

🔭 @DeepGravity
#DeepLearning models tend to increase their accuracy as the amount of training data grows, whereas traditional #MachineLearning models such as #SVM and the Naive #Bayes classifier stop improving after a saturation point.

Link

🔭 @DeepGravity
A very interesting paper by #Harvard University and #OpenAI

#DeepDoubleDescent: Where Bigger Models and More Data Hurt

Abstract
We show that a variety of modern deep learning tasks exhibit a “double-descent” phenomenon where, as we increase model size, performance first gets worse and then gets better. Moreover, we show that double descent occurs not just as a function of model size, but also as a function of the number of training epochs. We unify the above phenomena by defining a new complexity measure we call the effective model complexity and conjecture a generalized double descent with respect to this measure. Furthermore, our notion of model complexity allows us to identify certain regimes where increasing (even quadrupling) the number of train samples actually hurts test performance.

Paper

Related article

#DeepLearning

🔭 @DeepGravity
Deciphering interaction fingerprints from protein molecular surfaces using geometric #DeepLearning

Abstract
Predicting interactions between proteins and other biomolecules solely based on structure remains a challenge in biology. A high-level representation of protein structure, the molecular surface, displays patterns of chemical and geometric features that fingerprint a protein’s modes of interactions with other biomolecules. We hypothesize that proteins participating in similar interactions may share common fingerprints, independent of their evolutionary history. Fingerprints may be difficult to grasp by visual analysis but could be learned from large-scale datasets. We present MaSIF (molecular surface interaction fingerprinting), a conceptual framework based on a geometric deep learning method to capture fingerprints that are important for specific biomolecular interactions. We showcase MaSIF with three prediction challenges: protein pocket-ligand prediction, protein–protein interaction site prediction and ultrafast scanning of protein surfaces for prediction of protein–protein complexes. We anticipate that our conceptual framework will lead to improvements in our understanding of protein function and design.

Paper

🔭 @DeepGravity
Security of #DeepLearning Methodologies: Challenges and Opportunities

Despite the plethora of studies about security vulnerabilities and defenses of deep learning models, security aspects of deep learning methodologies, such as transfer learning, have been rarely studied. In this article, we highlight the security challenges and research opportunities of these methodologies, focusing on vulnerabilities and attacks unique to them.

Paper

🔭 @DeepGravity
The year in AI: 2019 #ML / #AI advances recap

It has become somewhat of a tradition for me to do an end-of-year retrospective of advances in AI/ML (see last year's round-up, for example), so here we go again! This year started with major recognition of the impact of #DeepLearning when #Hinton, #Bengio, and #LeCun were awarded the #Turing Award.

Link

🔭 @DeepGravity
Seven differences between academia and industry when building machine learning and #DeepLearning models

1) Approach to accuracy
2) Training vs serving
3) Emphasis on Engineering
4) Less emphasis on larger models
5) Understanding the baseline
6) Understanding the intricacies of data
7) Focusing on deep learning too early

Link

🔭 @DeepGravity
#TensorFlow 2 Tutorial: Get Started in #DeepLearning With tf.keras

After completing this tutorial, you will know:

The difference between Keras and tf.keras and how to install and confirm TensorFlow is working.
The 5-step life-cycle of tf.keras models and how to use the sequential and functional APIs.
How to develop MLP, CNN, and RNN models with tf.keras for regression, classification, and time series forecasting.
How to use the advanced features of the tf.keras API to inspect and diagnose your model.
How to improve the performance of your tf.keras model by reducing overfitting and accelerating training.
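The 5-step life-cycle mentioned above (define, compile, fit, evaluate, predict) can be sketched with the Sequential API on synthetic data. This is a minimal illustration, not the tutorial's own code; the layer sizes, hyperparameters, and toy dataset are assumptions.

```python
import numpy as np
import tensorflow as tf

# 1. Define the model: a small MLP for binary classification.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
# 2. Compile: choose optimizer, loss, and metrics.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# 3. Fit on (synthetic) training data.
X = np.random.rand(64, 4).astype("float32")
y = (X.sum(axis=1) > 2.0).astype("float32")
model.fit(X, y, epochs=5, batch_size=16, verbose=0)
# 4. Evaluate (here on the training data, for brevity).
loss, acc = model.evaluate(X, y, verbose=0)
# 5. Predict on new samples: one probability per input row.
probs = model.predict(X[:3], verbose=0)
```

The functional API covered in the tutorial follows the same five steps; only step 1 (how the graph of layers is defined) changes.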

#Keras

Link

🔭 @DeepGravity
During the last two days, some famous #MachineLearning researchers elucidated their own definitions of #DeepLearning. You might check the related links to read the full definition and discussion of each.

Yann LeCun:
#DL is constructing networks of parameterized functional modules & training them from examples using gradient-based optimization. That's it.
This definition is orthogonal to the learning paradigm: reinforcement, supervised, or self-supervised.
https://www.facebook.com/722677142/posts/10156463919392143/

Andriy Burkov:
Looks like in late 2019, people still need a definition of deep learning, so here's mine: deep learning is finding parameters of a nested parametrized non-linear function by minimizing an example-based differentiable cost function using gradient descent.
https://www.linkedin.com/posts/andriyburkov_looks-like-in-late-2019-people-still-need-activity-6615377527147941888-ce68/

François Chollet:
Deep learning refers to an approach to representation learning where your model is a chain of modules (typically a stack / pyramid, hence the notion of depth), each of which could serve as a standalone feature extractor if trained as such.
https://twitter.com/fchollet/status/1210031900695449600
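The three definitions above share a common core: a nested, parameterized, non-linear function whose parameters are found by gradient descent on a differentiable, example-based cost. A minimal NumPy sketch of exactly that (illustrative only, not from any of the linked posts): a two-layer network fit to y = 2x + 1.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(128, 1))
y = 2 * X + 1

# Parameterized modules: two affine layers with a tanh non-linearity between them.
W1, b1 = rng.normal(0, 0.5, (1, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)

lr = 0.1
for _ in range(2000):
    # Forward pass through the nested function f(x) = W2·tanh(W1·x + b1) + b2.
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    # Differentiable example-based cost: mean squared error; err is its gradient w.r.t. pred (up to a factor of 2).
    err = pred - y
    # Backward pass: gradients of the cost via the chain rule.
    dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)
    dh = err @ W2.T * (1 - h ** 2)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0)
    # Gradient-based optimization step.
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

mse = float(((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
```

Each layer here is one of LeCun's "parameterized functional modules", and the loop is Burkov's "minimizing an example-based differentiable cost function using gradient descent"; Chollet's point is that the hidden layer h could serve as a standalone feature extractor.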

Link

🔭 @DeepGravity
Dive into Deep Learning

An interactive #DeepLearning #book with code, math, and discussions, based on the #NumPy interface.

Book

🔭 @DeepGravity
Greetings to all of you, dear friends,
I hope these hard spring days will soon pass, with green prevailing over black, though the sorrow will never be forgotten.

To examine the dimensions of the #Corona crisis from a #MachineLearning point of view, I intend, with your help, to set up an online brainstorming session. In the link below you will see several time slots. Please pick the one that suits you best, so that we can gather at that time via Zoom or Google Meet. I have tried to spread the options across morning, afternoon, and evening so that, given the time-zone differences, we can find a common slot:

https://doodle.com/poll/69fvgkegwq3y8p6w

The goal of this session is mainly brainstorming and sharing what we know and have. I have prepared two repos myself, which I will explain.
(The goal is not to publish a paper or do commercial work.)

I hope that, alongside the medical teams, we too can be of some help to the country (and perhaps the world) in these circumstances.

If you have any suggestions, please leave them in the comments or send me a private message.

Sincerely,
#ai #computervision #machinelearning #deeplearning #covid19

@Reza

🔭 @DeepGravity