Project Progress

Week 15 (21st Apr - 27th Apr 2020)

  1. Readings:

    1. An Empirical Study of Example Forgetting during Deep Neural Network Learning.

    2. Do Better ImageNet Models Transfer Better?

    3. When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks (camera-ready version).

    4. Benchmarking Neural Network Robustness to Common Corruptions and Surface Variations.

  2. Making the outline structure of the paper.

  3. Making various timelines for the submission.

  4. Running more experiments.

  5. Thinking about what to do with the When NAS Meets Robustness paper, as we cannot compare against it directly because it follows a different experimental setting.

Week 14 (14th Apr - 20th Apr 2020)

  1. Working on the ensemble + NAS + MAML idea.

  2. Looking at ways to present the CVPR_NASWS submission as a complete work for an ICML workshop.

  3. Looking at recently published experimental papers to extend the workshop work.

  4. Running more experiments on different datasets to decide the trajectory of the workshop paper.

  5. Held meetings with Gaurav on the above items.

Week 13 (7th Apr - 13th Apr 2020)

  1. Discussing the ideas for ensemble + NAS work.

  2. Reading various papers related to MAML, ensembling, searching in the data space, and auto-augmenting data.

  3. We decided to focus on the main goal of increasing the speed of NAS on large-scale datasets. We plan to use bagging and MAML in some way to achieve this.

Week 12 (31st Mar - 6th Apr 2020)

  1. Reading Efficient Forward Architecture Search.

    1. Main Idea: Start from a small network and add layers based on the requirement.

    2. Allows us to start from architectures we already have (DenseNets and ResNets, built with human expertise) and build on top of them.

    3. Inspired by Cascade-correlation and gradient boosting.

    4. The proposed method, Petridish, can be used for both cell search (a.k.a. micro-search) and macro-search (a toy sketch of the grow-as-needed idea is at the end of this week's notes).

  2. Reading about random forests, bagging, and other ensemble methods.

  3. Ideating and discussing how to combine NAS and ensembling.
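
  Note: a toy, hypothetical sketch of the start-small-and-grow idea summarized above (not Petridish's actual algorithm): grow a small MLP one hidden layer at a time and keep a new layer only if it improves validation loss.

      import copy
      import torch
      import torch.nn as nn

      def train(model, X, y, steps=200, lr=1e-2):
          opt = torch.optim.Adam(model.parameters(), lr=lr)
          for _ in range(steps):
              opt.zero_grad()
              nn.functional.cross_entropy(model(X), y).backward()
              opt.step()

      def val_loss(model, X, y):
          with torch.no_grad():
              return nn.functional.cross_entropy(model(X), y).item()

      def grow(X_tr, y_tr, X_val, y_val, in_dim, n_cls, width=32, max_layers=5):
          # Start from a deliberately small network.
          body = [nn.Linear(in_dim, width), nn.ReLU()]
          model = nn.Sequential(*body, nn.Linear(width, n_cls))
          train(model, X_tr, y_tr)
          best = val_loss(model, X_val, y_val)
          for _ in range(max_layers):
              # Candidate: a copy of the current body with one extra hidden layer.
              cand_body = [copy.deepcopy(m) for m in body] + [nn.Linear(width, width), nn.ReLU()]
              cand = nn.Sequential(*cand_body, nn.Linear(width, n_cls))
              train(cand, X_tr, y_tr)
              loss = val_loss(cand, X_val, y_val)
              if loss < best:   # keep the new layer only if validation improves
                  body, model, best = cand_body, cand, loss
              else:
                  break         # stop growing once adding layers stops helping
          return model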

Week 11 (24th Mar - 30th Mar 2020)

  1. Reading Efficient Forward Architecture Search.

  2. Reading about different ensemble methods and watching some video lectures.

  3. Cleaning the code and uploading it anonymously for CVPR_NASWS.

  4. Thinking about ideas regarding future work.

Week 10 (17th Mar - 23rd Mar 2020)

  1. Running experiments for the CVPR_NASWS.

  2. Completing the CVPR_NASWS paper and submitting it.

  3. Meeting: discussed ensembles and NAS.

  4. TODO: Read the Efficient Forward Architecture Search paper.

Week 9 (10th Mar - 16th Mar 2020)

  1. Running experiments for the CVPR_NASWS.

  2. Writing the first end-to-end submission.

Week 8 (3rd Mar - 9th Mar 2020)

  1. Running adversarial experiments on CIFAR-10 and ImageNet with ResNet, DenseNet, VGG, Inception, P-DARTS, DARTS, and NSGA-Net.

  2. Writing the initial draft for CVPR_NASWS; discussed it with sir, noted the changes that need to be made, and started working on them.

  3. Reading/skimming these papers:

    1. Evolving Robust Neural Architectures to Defend from Adversarial Attacks

    2. Understanding and Robustifying Differentiable Architecture Search

    3. XNAS: Neural Architecture Search with Expert Advice

    4. Tuning Hyperparameters without Grad Students: Scalable and Robust Bayesian Optimisation with Dragonfly

    5. Searching for A Robust Neural Architecture in Four GPU Hours

    6. Learning Deep ResNet Blocks Sequentially using Boosting Theory

    7. Deep Neural Network Ensembles against Deception: Ensemble Diversity, Accuracy and Robustness

    8. Provably Robust Boosted Decision Stumps and Trees against Adversarial Attacks

    9. Coupled Ensembles of Neural Networks

    10. NAS Evaluation Is Frustratingly Hard

    11. When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks.

    12. Improving Neural Architecture Search Image Classifiers via Ensemble Learning

    13. AdaNet: Adaptive Structural Learning of Artificial Neural Networks

    14. Sub-Architecture Ensemble Pruning in Neural Architecture Search

Week 7 (25th Feb - 2nd Mar 2020)

  1. Completed the draft.

  2. Made an outline of the stuff that needs to be done before completing the CVPR_NASWS paper:

    1. List the papers that need to be discussed and cited for the NAS workshop submission.

    2. Experiments:

      1. Common adversarial attacks on three standard architectures (ResNet, DenseNet, VGG) and 2-3 NAS architectures (picked from the NAS Evaluation Is Frustratingly Hard paper).

      2. Can we reproduce the results of the Improving NAS or When NAS Meets Robustness papers? (That would strengthen our argument.)

    3. Can we simply make an ensemble-style architecture from existing NAS papers?

    4. Find 3-4 architectures, build an ensemble, and prune each of them.

  3. Summarized papers for writing CVPR_NASWS:

    1. Provably Robust Boosted Decision Stumps and Trees against Adversarial Attacks.

    2. Deep Neural Network Ensembles against Deception: Ensemble Diversity, Accuracy and Robustness (Ling Liu, Wenqi Wei, Ka-Ho Chow, Margaret Loper, Emre Gursoy, Stacey Truex, Yanzhao Wu).

    3. Learning Deep ResNet Blocks Sequentially using Boosting Theory.

  4. Ran some experiments on adversarial robustness and existing methods (a minimal attack sketch is included below).
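
  Note: the notes above don't record exactly which attacks were run; below is a minimal, hypothetical sketch of the kind of robustness check described, using a single-step FGSM attack on a PyTorch classifier. The epsilon value and the model choice are placeholders, not the actual experimental settings.

      import torch
      import torch.nn.functional as F
      import torchvision

      def fgsm(model, x, y, eps):
          # FGSM: perturb the input along the sign of the input gradient of the loss.
          x = x.clone().detach().requires_grad_(True)
          F.cross_entropy(model(x), y).backward()
          return (x + eps * x.grad.sign()).clamp(0, 1).detach()

      def robust_accuracy(model, loader, eps=8 / 255):
          model.eval()
          correct, total = 0, 0
          for x, y in loader:
              x_adv = fgsm(model, x, y, eps)
              with torch.no_grad():
                  correct += (model(x_adv).argmax(1) == y).sum().item()
              total += y.numel()
          return correct / total

      # Example: a torchvision ResNet-18 with a 10-class head (data loaders not shown).
      model = torchvision.models.resnet18(num_classes=10)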

Week 6 (18th Feb - 24th Feb 2020)

  1. Started working on the draft for the paper.

  2. Setting up some initial experiments related to adversarial robustness and running some of the existing NAS techniques (P-DARTS).

  3. Thought about NAS + ensemble ideas and ideas for submission to the CVPR NAS workshop:

    1. Ensemble models are generally more robust and accurate than single-architecture models (cite the Improving NAS paper).

    2. Ensembles will help leverage the power of different architectures (one architecture can have skip connections, others can have dense connections, etc.).

    3. All the existing DNN ensembles are handcrafted; using NAS to directly find ensembles might open new doors and help improve both the performance and the robustness of neural network architectures.

    4. Each sub-architecture in an ensemble can be used to learn different batches (or classes) of data, which may help in building specialized architectures for problems like few-shot and zero-shot learning.

    5. Show that most of the existing NAS-based architectures are less robust (by running 1-2 experiments on existing benchmarks); as a result, finding ensembles using NAS is an important problem.

    6. Using ensembles can decrease the search-space complexity. For example, as shown in the figure below, we can fix a part of the network to be constant for all the architectures in the ensemble and do NAS to find only the last few layers of each one (a small sketch of this idea follows the figure placeholder).

    [Figure: ensemble in which a shared backbone is kept fixed and NAS searches only the final layers of each member.]
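
  Note: a minimal, hypothetical sketch of the shared-backbone idea in point 6 (not taken from any of the cited papers): every member shares one fixed feature extractor, and only the small per-member head is the part NAS would search. Here the heads are hand-picked variants standing in for searched cells.

      import torch
      import torch.nn as nn

      class SharedBackboneEnsemble(nn.Module):
          def __init__(self, backbone, heads):
              super().__init__()
              self.backbone = backbone           # fixed for every ensemble member
              self.heads = nn.ModuleList(heads)  # the per-member part NAS would search

          def forward(self, x):
              feats = self.backbone(x)
              # Average the member predictions (simple ensemble combination).
              return torch.stack([h(feats) for h in self.heads]).mean(dim=0)

      backbone = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                               nn.AdaptiveAvgPool2d(1), nn.Flatten())
      heads = [nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10)),
               nn.Sequential(nn.Linear(32, 10)),
               nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 10))]
      model = SharedBackboneEnsemble(backbone, heads)
      logits = model(torch.randn(4, 3, 32, 32))  # -> shape (4, 10)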

Week 5 (11th Feb - 17th Feb 2020)

  1. Thinking about ideas relevant to NAS + ensembling.

  2. Reviewing some of the already read papers.

  3. Reading reviews of some of the NAS papers, e.g. https://openreview.net/forum?id=HygrdpVKvr

  4. Reading about ensembling techniques and looking at some relevant maths.

Week 4 (4th Feb - 10th Feb 2020)

  1. Readings:

    1. AdaNet: Adaptive Structural Learning of Artificial Neural Networks

    2. Learning Deep ResNet Blocks Sequentially using Boosting Theory

    3. Improving Neural Architecture Search Image Classifiers via Ensemble Learning

  2. Todo:

    1. We were planning to replace the boosting used in the Improving NAS paper with bagging and other ensembling methods; for this we'll have to first read the AdaNet paper thoroughly (a rough sketch of the bagging variant is at the end of this list).

    2. Go through the recent DNN + ensembling methods survey paper.

    3. We have two NAS + ensembling papers, neither of which has been accepted anywhere. We'll go through them thoroughly and try to find out why they were not accepted.

    4. Write the draft of the paper and submit it.
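
  Note: a rough, hypothetical sketch of the boosting-to-bagging swap mentioned in point 1 (not the Improving NAS paper's actual method): instead of adding members sequentially to correct the previous ensemble's errors, train each searched architecture independently on a bootstrap sample and average the predictions.

      import torch
      import torch.nn as nn

      def bootstrap(X, y):
          idx = torch.randint(0, len(X), (len(X),))  # sample with replacement
          return X[idx], y[idx]

      def train_member(model, X, y, steps=200, lr=1e-3):
          opt = torch.optim.Adam(model.parameters(), lr=lr)
          for _ in range(steps):
              opt.zero_grad()
              nn.functional.cross_entropy(model(X), y).backward()
              opt.step()
          return model

      def bagged_ensemble(search_fn, X, y, n_members=3):
          # search_fn() stands in for a NAS call that returns a fresh architecture.
          members = []
          for _ in range(n_members):
              Xb, yb = bootstrap(X, y)
              members.append(train_member(search_fn(), Xb, yb))
          return members

      def predict(members, X):
          # Bagging-style combination: average the member logits, then take argmax.
          with torch.no_grad():
              return torch.stack([m(X) for m in members]).mean(dim=0).argmax(dim=1)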

Week 3 (28th Jan - 3rd Feb 2020)

  1. Readings:

    1. Improving Neural Architecture Search Image Classifiers via Ensemble Learning

    2. Sub-Architecture Ensemble Pruning in Neural Architecture Search

    3. NAS Evaluation Is Frustratingly Hard

  2. Minutes of meeting with Vineeth Sir:

    1. Ideas discussed:

      1. Training a neural network by training sub-architectures separately

      2. DNN accelerators focus on reducing the energy consumed during inference, which is largely governed by data movement; design DNNs whose architectures optimize data movement.

      3. NAS for MAML

      4. Treat a ResNet-like architecture as a supernet + MAML

      5. Gradient boosting + the Improving NAS paper (ensemble methods along with NAS in general)

  3. Todo (Next Steps):

    1. Decide on which idea to pursue next.

    2. Read the two papers mentioned above and see if we can replace boosting with bagging.

    3. Read: Learning Deep ResNet Blocks Sequentially using Boosting Theory.

Week 2 (21st Jan - 27th Jan 2020)

  1. Readings:

    1. Neural Architecture Search: A Survey

    2. Best Practices for Scientific Research on Neural Architecture Search

    3. AdversarialNAS: Adversarial Neural Architecture Search for GANs

    4. AutoGAN: Neural Architecture Search for Generative Adversarial Networks

    5. Evolving Robust Neural Architectures to Defend from Adversarial Attacks

    6. Adversarial Robustness vs. Model Compression, or Both?

    7. When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks

    8. Adversarial Robustness of Pruned Neural Networks

  2. Presentation:

    1. When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks

    2. Network Pruning via Transformable Architecture Search

  3. Todo (Next Steps):

    1. The problem statement we currently have seems to overlap with many existing papers.

    2. Brainstorm and finalize the problem statement this week

Week 1 (13th Jan - 20th Jan 2020)

  1. Building background on adversarial examples and attacks:

    1. Readings: Intriguing Properties of Adversarial Examples

    2. Videos Watched:

      1. Adversarial Examples and Training, CS231n

      2. Adversarial Machine Learning, ICLR 2019 talk

  2. Building background on NAS:

    1. Readings:

      1. DARTS: Differentiable Architecture Search

      2. Progressive Neural Architecture Search

      3. Efficient Neural Architecture Search via Parameter Sharing

    2. Videos Watched:

      1. NAS SOTA review talk, MSR

      2. Efficient Forward Architecture Search

  3. Finding papers relevant to NAS and Adversarial attacks, found six papers.

  4. Papers Read relevant to NAS and Adversarial attacks:

    1. Network Pruning via Transformable Architecture Search

    2. When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks

  5. Some ideas in the brainstorming session:

    1. NAS with a budget constraint (in terms of parameters and layers)

    2. NAS for coming up with adversarially robust architectures

    3. NAS in transfer learning, GANs, Multi-task, and Multi-Objective learning

    4. NAS for developing architectures based on the task at hand (one-shot, few-shot, etc.)

  6. Todo (Next Steps):

    1. We decided to work on “NAS for coming up with adversarially robust architectures” problem, but we may have to tweak the problem statement a little bit as there are a few new papers on this topic.

    2. Read the remaining papers.

    3. Finalise the initial idea.