Here is the daily log of Haihao Liu from 2016.6.5 to 2016.8.7 in the Saito Lab at Tohoku University.
Daily schedule (tentative)†
- 10:00-12:00 Finishing/continuing any work from previous day
- 12:00-13:00 Lunch
- 13:00-14:30 Discussion with Shoufie-san
- 14:30-15:00 Prepare daily report presentation
- 15:00-15:30 Meeting with Saito-sensei
- 15:30-18:00 Continuing work/updating Pukiwiki
Goal of the project†
- To enhance the electric field at the graphene surface by optimizing the pattern of dielectric thin layers using deep learning algorithms.
- Keywords: graphene, transfer matrix, deep learning.
Questions and Answers†
This section is for posting questions from Haihao-san and answers from other group members.
- Please list questions here with brief reasons or details.
- Tag each question with double asterisks (**) in the wiki source so that it appears in the table of contents.
- Tag each answer with triple asterisks (***) below its question so that it is aligned properly (see the example below).
- List from new to old.
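For example, the wiki source for one Q&A pair would look like this (the question and answer text are placeholders):
 **Q: (your question here)
 ***A: (answer from a group member)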
Q: (Placeholder)†
A: (Placeholder)†
Report†
This part is written mainly by Haihao-san, but anyone else can add to it. Entries run from new to old so that we do not need to scroll.
June 22†
- Created frequency vs discrete T plot, and T vs Gray code plot for 4 layers
- Identified sequences with identical transmission spectra; found mirror symmetry (e.g. 1101 = 1011)
- Investigated the effect of flipping one bit on the E field for the 4-layer system; still trying to interpret the results (at least the behavior does not seem chaotic)
To do:
- Prove using induction that mirrored sequences give identical transmission (see the reciprocity sketch below)
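A reciprocity argument may sidestep the induction (a sketch, assuming one common convention in which the total transfer matrix M maps the amplitude pair (E+, E-) on the incident side to the transmitted side):
 M = M_N \cdots M_2 M_1, \qquad \det M = 1 \quad \text{(same ambient medium on both sides)}
 t_{L \to R} = \frac{\det M}{M_{22}} = \frac{1}{M_{22}}, \qquad t_{R \to L} = \frac{1}{M_{22}} \;\;\Rightarrow\;\; T_{L \to R} = T_{R \to L}
Since the mirrored sequence viewed from the left is exactly the original sequence viewed from the right, its transmission spectrum is identical at every frequency.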
June 21†
- Talked to a CS-major friend about the problem; he suggested hill climbing and genetic algorithms
- He doesn't think machine learning is needed to predict transmission, because we already have a program that computes it exactly! ML is for problems where you don't know how to program an algorithm directly
- Agreed that truly chaotic behavior would defeat any algorithm, including neural networks (imagine trying to predict an RNG)
- Tried to see what the effect of changing one bit really is
- Found that the transmission probability is discrete!
- Changing one bit can only move T one level up or down
- Investigated the effect of the number of layers and of changing the dielectric constants; the levels depend only on the ratio of the dielectric constants (see the sketch after this entry)
To do:
- Create frequency vs discrete T plot, and T vs Gray code plot for small number of layers
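For reference, a minimal sketch of this kind of scan (all parameter values, and the choice of plotting every 4-bit sequence, are illustrative, not the values used in the actual programs):
 % Sketch: transmission of every 4-bit dielectric sequence, transfer-matrix
 % method at normal incidence, vacuum on both sides of the stack.
 eps_a = 2.0; eps_b = 4.0;              % two dielectric constants (only ratio matters)
 d     = 100e-9;                        % layer thickness [m]
 c     = 3e8;                           % speed of light [m/s]
 f     = linspace(0.1e14, 5e14, 1000);  % frequency grid [Hz]
 nbits = 4;
 figure; hold on;
 for s = 0:2^nbits-1
     seq = bitget(s, nbits:-1:1);       % binary sequence, MSB first
     T = zeros(size(f));
     for j = 1:numel(f)
         k0 = 2*pi*f(j)/c;
         M  = eye(2);
         n_prev = 1;                    % incident medium: vacuum
         for bit = seq
             n = sqrt(eps_a); if bit == 1, n = sqrt(eps_b); end
             D = [n+n_prev, n-n_prev; n-n_prev, n+n_prev] / (2*n);  % matching
             P = [exp(1i*n*k0*d), 0; 0, exp(-1i*n*k0*d)];           % propagation
             M = P*D*M;
             n_prev = n;
         end
         D = [1+n_prev, 1-n_prev; 1-n_prev, 1+n_prev] / 2;  % match back to vacuum
         M = D*M;
         T(j) = abs(1/M(2,2))^2;        % det M = 1 (same medium both sides)
     end
     plot(f, T);                        % mirrored sequences should overlap exactly
 end
 xlabel('frequency [Hz]'); ylabel('T');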
June 20†
- Found hole/security flaw in SquirrelMail webmail
- Wrote abstract for Zao meeting
- Researched GPUs for deep learning
- Deployed catnet in MATLAB; initial tests were a big success
- Figured out MNIST wasn't working because the input needed a simple division/rescaling; still not sure it works
- Learned how to read/write files in MATLAB; played with different formats for saving data
To do:
- Create database of transmission spectra and E field plots (see the sketch after this list)
- Compile training data using random 100-layer sequences?
- Try to train with catnet (imagine)
- Talk to CS friend about how to implement, what type of neural network
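One possible shape for the database step (a sketch: the file names, array layout, and the calc_transmission helper are placeholders, the helper standing in for the existing transfer-matrix program):
 % Sketch: save transmission spectra of random binary sequences to a file.
 nlayers = 100; nsamples = 1000;
 f = linspace(0.1e14, 5e14, 500);            % frequency grid [Hz]
 seqs    = randi([0 1], nsamples, nlayers);  % random 100-layer sequences
 spectra = zeros(nsamples, numel(f));
 for i = 1:nsamples
     spectra(i,:) = calc_transmission(seqs(i,:), f);  % placeholder function
 end
 save('training_data.mat', 'seqs', 'spectra', 'f');   % binary .mat format
 % Alternatively, plain text for use outside MATLAB:
 % dlmwrite('seqs.txt', seqs); dlmwrite('spectra.txt', spectra);
 loaded = load('training_data.mat');                  % read it back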
June 18†
- Sendai castle adventure
- Performance group
June 17†
- Finally got MatCaffe working, but the MNIST example still does not seem to work
- Installed Caffe (/liu/caffe) on tube61 (Ubuntu 15.04) with ATLAS, CPU-only, ran catnet, still slow, maybe even slower than Mac! (~6 min/20 iterations)
- Moved cat images to flex, continued training
- Installed OpenBLAS on tube61 to try to rebuild Caffe with OpenBLAS
- Installed Intel MKL (commercial, got a free student license) on tube60; installed Caffe (/liu/research/caffe) on tube60 (Ubuntu 12.04) with MKL, CPU-only, catnet not yet tested
To do:
- Rebuild with OpenBLAS (see the Makefile.config excerpt below)
- Finish training catnet
- Look into OpenMP for parallel processing support; OpenMP+MKL reportedly gives performance comparable to a GPU
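For the OpenBLAS rebuild, the relevant switches live in Caffe's Makefile.config (an excerpt from memory; the paths are illustrative and depend on where OpenBLAS was installed):
 # Makefile.config (excerpt) -- CPU-only build with OpenBLAS
 CPU_ONLY := 1
 BLAS := open            # atlas | open | mkl
 # If OpenBLAS is in a non-standard location:
 # BLAS_INCLUDE := /opt/OpenBLAS/include
 # BLAS_LIB := /opt/OpenBLAS/lib
 # then: make clean && make all && make test && make runtest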
June 16†
- Downloaded 7000+ cat and dog images
- Training catnet is slow! 5 min/20 iter (1000 in total)
- Downloaded MNIST dataset, trained in ~10 min
- Python wrapper installed properly, but the MATLAB one is not working, so can't deploy yet
To do:
- Test MNIST by deploying in either MATLAB or Python
- Look into GPUs, Saito-sensei might buy one to install on lab server
June 15†
- Talked with Shoufie-san about the basics of neural networks, showed him the power of genetic algorithms, and discussed ideas on how to use networks to find/generate good sequences (a simple hill-climbing sketch follows this entry)
- Continued troubleshooting constant errors; built Caffe, but make runtest kept failing
- Found that my laptop's GPU is too old and not powerful enough; rebuilt CPU-only and finally passed all tests
To do:
- Learn how to use Caffe, run example programs MNIST and catnet
- Install MATLAB and Python wrappers/interface
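As a non-ML baseline for the search itself, bit-flip hill climbing (cf. June 21) is only a few lines. A sketch, where evaluate_sequence is a placeholder for the 0-100 scoring function described on June 10 and 13:
 % Sketch: hill climbing over binary layer sequences by single-bit flips.
 nlayers = 20;
 seq  = randi([0 1], 1, nlayers);       % random starting sequence
 best = evaluate_sequence(seq);         % placeholder scorer (0-100)
 improved = true;
 while improved
     improved = false;
     for i = 1:nlayers                  % try flipping each bit in turn
         trial = seq; trial(i) = 1 - trial(i);
         score = evaluate_sequence(trial);
         if score > best                % keep the flip if it helps
             seq = trial; best = score; improved = true;
         end
     end
 end
 fprintf('local optimum: %s, score %.1f\n', num2str(seq, '%d'), best);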
June 14†
- Installed Homebrew, Miniconda (lightweight Anaconda Python distro)
- Installed CUDA, other libraries Caffe has as dependencies
- Saito-sensei invited me to join Zao NanoCarbon Meeting, gladly accepted!
To do:
June 13†
- Found that the integral-based error function never gave a good score, max ~30 on the linear conversion scale for the 10-layer system
- Experimented with designing an evaluation function based on the Q factor for transmission, and on enhancement and position for the E field (a Q-factor sketch follows this entry)
- Watched some videos online to learn the basics of how deep learning is implemented, and to see more examples
To do:
- File read/write to avoid recalculating every time
- Install Caffe
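A rough version of the Q-factor piece (a sketch; it assumes T is a transmission spectrum on frequency grid f, and takes Q = f0/FWHM of the tallest peak):
 % Sketch: estimate the Q factor of the tallest transmission peak.
 [Tpk, ipk] = max(T);                       % tallest peak (simplistic choice)
 half = Tpk/2;
 il = find(T(1:ipk) < half, 1, 'last');     % crossing below half, left side
 ir = ipk - 1 + find(T(ipk:end) < half, 1); % crossing below half, right side
 if ~isempty(il) && ~isempty(ir)
     Q = f(ipk) / (f(ir) - f(il));          % Q = f0 / FWHM
 else
     Q = 0;                                 % peak never falls to half max
 end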
June 10†
- Adapted MATLAB programs to calculate the transfer matrix and plot transmission and enhancement for any arbitrary sequence
- Wrote error function (a sketch of the 0-100 scoring idea follows this entry)
- Lab party!
To do:
- Evaluate scaling functions (0-100)
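The integral-based scoring idea, roughly (a sketch; T_target and the linear conversion to 0-100 are placeholders):
 % Sketch: integral-based error between a computed spectrum T and a target
 % T_target on frequency grid f, converted linearly to a 0-100 score.
 err     = trapz(f, abs(T - T_target));     % area between the two spectra
 err_max = trapz(f, ones(size(f)));         % worst case: |T - T_target| = 1
 score   = 100 * (1 - err/err_max);         % 100 = perfect match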
June 9†
- Wrote MATLAB program to plot intensity of E field in Fibonacci lattice as function of position (z), looked at positions of enhancement
To do:
- Adapt code to calculate transfer matrix and plot transmission and enhancement for any arbitrary sequence
- Design error function (0-100 scale) to measure how close a given transmission or enhancement spectrum is to a target spectrum
- Look into Caffe machine learning
June 8†
- Wrote MATLAB program to construct the transfer matrix for the n-th Fibonacci lattice (the sequence generation is sketched after this entry)
- Plotted transmission probability as a function of frequency; the 10th generation made a nice-looking fractal
To do:
- Clean up MATLAB code
- Plot E field in Fibonacci lattice as function of distance (z), see points of most enhancement
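The Fibonacci layer sequence itself follows the standard substitution rule A -> AB, B -> A (a MATLAB sketch; my actual generator was the Python program from June 7, and the 1/0 encoding is just one convention):
 % Sketch: n-th Fibonacci word via A -> AB, B -> A, encoded 1 -> [1 0], 0 -> [1].
 n = 10;
 seq = 1;                        % generation 1: "A"
 for g = 2:n
     next = [];
     for s = seq
         if s == 1, next = [next 1 0]; else, next = [next 1]; end
     end
     seq = next;                 % length grows as the Fibonacci numbers
 end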
June 7†
- Walked to campus, took around 30 mins
- Found the explicit (Binet) formula for the n-th Fibonacci number (see below), and wrote a Python program to generate the n-th iteration of the Fibonacci fractal
- Had lunch with Hasdeo-san, bought lunch by weight from the cafeteria (entrees 1.4 yen/g, rice 0.43 yen/g)
- Derived matching (boundary) and propagation matrices in the transfer matrix method, used to calculate R and T probabilities for comparison (see the matrices below)
- Derived E and H relations from Maxwell's equations (Ampère's law)
- Introduced myself at group meeting, shared some omiyage from China
- Set up printer over LAN, had to install driver manually
To do:
- Check that theory agrees with Snell's Law
- Construct transfer matrix for n-th Fibonacci lattice
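For reference, Binet's formula and the normal-incidence matrices mentioned above (standard results; the sign of the propagation phase depends on the chosen time convention):
 F_n = \frac{\varphi^n - \psi^n}{\sqrt{5}}, \qquad \varphi = \frac{1+\sqrt{5}}{2}, \;\; \psi = \frac{1-\sqrt{5}}{2}
 D_{1\to2} = \frac{1}{2n_2} \begin{pmatrix} n_2+n_1 & n_2-n_1 \\ n_2-n_1 & n_2+n_1 \end{pmatrix}, \qquad P(d) = \begin{pmatrix} e^{ikd} & 0 \\ 0 & e^{-ikd} \end{pmatrix}, \quad k = \frac{n\omega}{c}
Here D carries the forward/backward amplitudes (E+, E-) across an interface from medium 1 into medium 2, and P propagates them through a layer of thickness d.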
June 6†
- Shoufie-san picked me up from Urban Castle Kawauchi, took me to campus by subway (International Center -> Aobayama, 200 yen ~20 mins total)
- Nugraha-sensei helped me set up lab server access, mail client, etc.
- Bento lunch from Espace Ouvert restaurant (next to 7-11 on bottom floor of new building), ate with Saito-sensei who played his ukulele
- Learned how to derive boundary conditions for an EM wave passing from one dielectric medium to another, using Maxwell's equations (Faraday's and Ampère's laws)
- Solved for reflection and transmission probabilities at the boundary for a normally incident wave (formulas at the end of this entry)
To do:
- Test SSH from dorm room
- Read pages on reflection and transmission of EM waves at a boundary, and derive the probabilities for both TE and TM oblique incident waves
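For the record, the normal-incidence results (standard Fresnel formulas for nonmagnetic media, wave going from index n_1 into n_2):
 r = \frac{n_1 - n_2}{n_1 + n_2}, \qquad t = \frac{2n_1}{n_1 + n_2}, \qquad R = r^2, \qquad T = \frac{n_2}{n_1}\, t^2 = \frac{4 n_1 n_2}{(n_1+n_2)^2}, \qquad R + T = 1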