MACHINE LEARNING

Paper Notes: The Shattered Gradients Problem ...

Paper: https://arxiv.org/abs/1702.08591 The full title of the paper is "The Shattered Gradients Problem: If resnets are the answer, then what is the question?". It is really interesting work, with many findings about the gradient dynamics of neural networks. It also examines Batch Normalization (BN) and Residual Networks (ResNets) under this problem. The problem, dubbed "Shattered Gradients", is described as … Continue reading Paper Notes: The Shattered Gradients Problem ...

Posted in Computer Vision, Machine Learning, Research Notes

Dilated Convolution

In simple terms, dilated convolution is just a convolution applied to the input with defined gaps. With this definition, given a 2D image as input, dilation rate k=1 is normal convolution, k=2 means skipping one pixel per input, and k=4 means skipping 3 pixels. It is best to see the figures below with the same k … Continue reading Dilated Convolution
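
As a quick illustration, here is a minimal sketch of the idea (assuming PyTorch; the post itself is framework-agnostic): the dilation rate spreads a 3x3 kernel over a wider neighbourhood without adding any weights.

import torch
import torch.nn as nn

x = torch.randn(1, 1, 32, 32)  # dummy single-channel 32x32 image

normal = nn.Conv2d(1, 1, kernel_size=3, padding=1, dilation=1)
dilated = nn.Conv2d(1, 1, kernel_size=3, padding=2, dilation=2)

print(normal(x).shape)   # torch.Size([1, 1, 32, 32])
print(dilated(x).shape)  # torch.Size([1, 1, 32, 32]) -- same output size, but each
                         # output pixel now sees a 5x5 region (3 taps spread 2 apart)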

Posted in Computer Vision, Machine Learning, Research Notes

Ensembling Against Adversarial Instances

What is Adversarial? Machine learning is everywhere and we are amazed by the capabilities of these algorithms. However, they are not perfect and sometimes they act surprisingly dumb. For instance, let's consider an image recognition model. This model achieves really high empirical performance and works great for normal images. Nevertheless, it might fail when you change … Continue reading Ensembling Against Adversarial Instances
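
Such adversarial instances are typically crafted with small gradient-based perturbations. Below is a hedged sketch of one common method, the fast gradient sign method (FGSM); it assumes PyTorch and some pretrained classifier, and it is only an illustration, not necessarily the attack discussed in the post.

import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, label, epsilon=0.01):
    """Return an adversarially perturbed copy of `image` (untargeted FGSM)."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # One small step along the sign of the input gradient: often imperceptible
    # to a human, yet frequently enough to flip the model's prediction.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()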

Posted in Machine Learning, Research, Research Notes

Paper Notes: Intriguing Properties of Neural Networks

Paper: https://arxiv.org/abs/1312.6199 This paper studies the description of semantic information by higher-level units of a network, and the blind spots of network models against adversarial instances. They illustrate the learned semantics by inferring the maximally activating instances per unit. They also interpret the effect of adversarial examples and their generalization across different network architectures and datasets. Findings … Continue reading Paper Notes: Intriguing Properties of Neural Networks
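
A hedged sketch of the "maximally activating instances per unit" idea: score every image in a dataset by how strongly it activates one chosen unit and keep the top few. The names model.features and dataset, and the channel-wise averaging, are assumptions for illustration, not the authors' code.

import torch

def top_activating_images(model, dataset, unit_index, k=8):
    """Indices of the k images that most excite one convolutional unit."""
    model.eval()
    scores = []
    with torch.no_grad():
        for image, _ in dataset:                       # (image, label) pairs
            acts = model.features(image.unsqueeze(0))  # assumed shape (1, C, H, W)
            scores.append(acts[0, unit_index].mean().item())
    order = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
    return order[:k]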

Posted in Machine Learning, Research

Short guide to deploy Machine Learning

Suppose you have a problem that you would like to tackle with machine learning and use the resulting system in a real-life project. I would like to share my simple pathway for this purpose, in order to provide a basic guide for beginners and keep these things as a reminder to myself. These rules are tricky since even though … Continue reading Short guide to deploy Machine Learning

Posted in Machine Learning, path-way

Selfai: A Method for Understanding Beauty in Selfies

Selfies are everywhere. With different fun masks, poses and filters, it goes crazy. When we come across any of these selfies, we automatically give an intuitive score regarding the quality and beauty of the selfie. However, it is not really possible to describe what makes a beautiful selfie. There are some obvious attributes, but they are not fully prescribed. With … Continue reading Selfai: A Method for Understanding Beauty in Selfies

Posted in Computer Vision, Machine Learning, Research, Research Notes

Paper review - Understanding Deep Learning Requires Rethinking Generalization

Paper: https://arxiv.org/pdf/1611.03530v1.pdf This paper states, in essence, the following: traditional machine learning frameworks (VC dimension, Rademacher complexity etc.) that try to explain how learning occurs do not really explain the success of deep learning models, and we need more understanding from different perspectives. They rely on the following empirical observations: Deep networks are able to learn … Continue reading Paper review - Understanding Deep Learning Requires Rethinking Generalization

Posted in Machine Learning, Research, Research Notes

Important Nuances to Train Deep Learning Models.

A crucial problem in real DL system design is to capture the test data distribution with a trained model that only sees the training data distribution. Therefore, it is always important to find a good data splitting scheme which at least gives the right measure of such divergence. It is always a waste to spend … Continue reading Important Nuances to Train Deep Learning Models.
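
As a minimal example of a splitting scheme (assuming scikit-learn and placeholder arrays), a stratified train/validation/test split at least keeps the class ratios comparable across the three sets.

import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.randn(1000, 20)            # placeholder features
y = np.random.randint(0, 2, size=1000)   # placeholder binary labels

# Hold out 20% as the test set, stratified by label.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Carve 25% of the remaining 80% out as validation -> roughly 60/20/20 overall.
X_train, X_val, y_train, y_val = train_test_split(
    X_train, y_train, test_size=0.25, stratify=y_train, random_state=42)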

Posted in Machine Learning, Research Notes

Face Detection by Literature

Please ping me if you know of something more. Multi-view Face Detection Using Deep Convolutional Neural Network: train a face classifier with face (> 0.5 overlap) and background (< 0.5 overlap) images. Compute a heatmap over the test image, scaled to different sizes, with a sliding window. Apply NMS. Computation intensive, especially on CPU. http://arxiv.org/abs/1502.02766 From Facial Parts Responses … Continue reading Face Detection by Literature
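
A hedged NumPy sketch of the NMS step mentioned above (boxes given as [x1, y1, x2, y2] rows plus a confidence score per box); this is generic greedy NMS, not the exact code of any of these papers.

import numpy as np

def nms(boxes, scores, iou_threshold=0.3):
    """Greedy non-maximum suppression; returns indices of the boxes to keep."""
    boxes = np.asarray(boxes, dtype=float)
    scores = np.asarray(scores, dtype=float)
    order = np.argsort(scores)[::-1]  # highest confidence first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # Overlap of the kept box with every box still in play.
        x1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        y1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        x2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        y2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.maximum(0.0, x2 - x1) * np.maximum(0.0, y2 - y1)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[order[1:], 2] - boxes[order[1:], 0]) * \
                (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_i + areas - inter + 1e-9)
        # Suppress detections that overlap the kept box too much.
        order = order[1:][iou <= iou_threshold]
    return keep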

Posted in Computer Vision, Machine Learning, Research Notes

Why do we need better word representations?

A successful AI agent should communicate. It is all about language. It should understand and explain itself in words in order to communicate with us. All of this starts with the "meaning" of words, which is the atomic part of human communication. This is one of the fundamental problems of Natural Language Processing (NLP). "Meaning" is described … Continue reading Why do we need better word representations?

Posted in Machine Learning, Research, Research Notes