ArtificialIntelligenceArticles
For those with a passion for:
1. #ArtificialIntelligence
2. Machine Learning
3. Deep Learning
4. #DataScience
5. #Neuroscience
6. #ResearchPapers
7. Related Courses and Ebooks
Discovering Neural Wirings (https://arxiv.org/abs/1906.00586)

In past years, developing deep neural architectures has required either manual design (e.g. AlexNet, ResNet, MobileNet, ...) or an expensive search over predefined block structures of layers (NAS, MnasNet, DARTS, ...). What if we instead view a neural network as a completely unstructured graph, where each node performs a simple operation on a single data point or channel (e.g. a 2D filter) and the nodes are massively wired together? In this paper we explain how to discover a good wiring of a neural network that minimizes the loss function with a limited amount of computation. We relax the typical notion of layers and instead allow channels to form connections independently of each other, which opens up a much larger space of possible networks. The wiring of our network is not fixed during training: as we learn the network parameters we also learn the structure itself.
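A minimal sketch of that core idea (not the authors' implementation; class and parameter names such as TinyWiredGraph and k_edges are made up for illustration): candidate edges between channel nodes carry learnable weights, only the k strongest edges are used in the forward pass, but gradients reach every candidate edge so the wiring can rearrange itself during training.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChooseTopKEdges(torch.autograd.Function):
    """Forward: keep only the k largest-magnitude edge weights (the 'real' wiring).
    Backward: pass gradients to every candidate edge, so currently unused
    ('hallucinated') edges can grow and swap into the network during training."""

    @staticmethod
    def forward(ctx, weights, k):
        mask = torch.zeros_like(weights)
        topk = weights.abs().flatten().topk(k).indices
        mask.view(-1)[topk] = 1.0
        return weights * mask

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output, None  # straight-through gradient for all edges


class TinyWiredGraph(nn.Module):
    """Toy unstructured graph over n scalar 'channel' nodes: node j sums weighted
    inputs from earlier nodes, then applies a nonlinearity. The edge weights
    (i.e. the wiring) are learned jointly with everything else."""

    def __init__(self, n_nodes=8, k_edges=12):
        super().__init__()
        self.n, self.k = n_nodes, k_edges
        # Every ordered pair (i < j) is a candidate edge with a learnable weight.
        self.edge_weight = nn.Parameter(0.1 * torch.randn(n_nodes, n_nodes))

    def forward(self, x):                                      # x: (batch, n_nodes)
        candidates = torch.triu(self.edge_weight, diagonal=1)  # keep the graph acyclic
        w = ChooseTopKEdges.apply(candidates, self.k)
        states = [x[:, 0:1]] + [None] * (self.n - 1)
        for j in range(1, self.n):
            incoming = sum(states[i] * w[i, j] for i in range(j))
            states[j] = F.relu(incoming + x[:, j:j + 1])
        return torch.cat(states, dim=1)
```

Calling TinyWiredGraph()(torch.randn(4, 8)) and backpropagating any loss updates all candidate edge weights, so the set of top-k "real" edges, i.e. the discovered wiring, changes as training progresses.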
MelNet: A Generative Model for Audio in the Frequency Domain
Sean Vasquez and Mike Lewis: https://arxiv.org/abs/1906.01083
Blog: https://sjvasquez.github.io/blog/melnet/
#ArtificialIntelligence #DeepLearning #MachineLearning
IoT Network Security from the Perspective of Adversarial Deep Learning: https://arxiv.org/abs/1906.00076
Best paper award at #CVPR2018:

"Taskonomy: Disentangling Task Transfer Learning"

Abstract: Do visual tasks have a relationship, or are they unrelated? For instance, could having surface normals simplify estimating the depth of an image? Intuition answers these questions positively, implying the existence of a structure among visual tasks. Knowing this structure has notable value; it is the concept underlying transfer learning and provides a principled way to identify redundancies across tasks, e.g., to seamlessly reuse supervision among related tasks or solve many tasks in one system without piling up complexity. We propose a fully computational approach for modeling the structure of the space of visual tasks (...).

Paper: https://arxiv.org/pdf/1804.08328.pdf
Data: https://taskonomy.stanford.edu
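As a rough illustration of the "fully computational approach" the abstract alludes to, here is a hedged sketch: measure a transfer-affinity score for every source-to-target task pair, then read a simple transfer recommendation off that matrix. The task list and scores below are placeholder assumptions, not the paper's measurements; the actual method trains transfer networks and solves an optimization over the measured affinities.

```python
import numpy as np

tasks = ["depth", "surface_normals", "edges", "autoencoding"]

# affinity[i][j] stands in for "how well features from source task i
# transfer to target task j" (placeholder numbers, NOT from the paper).
affinity = np.array([
    [1.00, 0.70, 0.40, 0.30],   # from depth
    [0.75, 1.00, 0.45, 0.35],   # from surface_normals
    [0.35, 0.40, 1.00, 0.50],   # from edges
    [0.25, 0.30, 0.45, 1.00],   # from autoencoding
])

# For each target task, pick the best *other* source task to transfer from.
for j, target in enumerate(tasks):
    scores = affinity[:, j].copy()
    scores[j] = -np.inf                      # exclude trivial self-transfer
    best_source = tasks[int(np.argmax(scores))]
    print(f"{target:>15s}  <-  transfer from {best_source}")
```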

#award #artificialintelligence #deeplearning #transferlearning
Overlooked No More: Alan Turing never had an obituary in the New York Times.
Until now.
By Alan Cowell: https://www.nytimes.com/2019/06/05/obituaries/alan-turing-overlooked.html
#AlanTuring #ArtificialIntelligence #Mathematics