This is the daily log of Haihao Liu from 2016.6.5 to 2016.8.7 at Saito Lab, Tohoku University.

#contents

*Daily schedule (tentative) [#schedule]

- 10:00-12:00 Finishing/continuing any work from previous day
- 12:00-13:00 Lunch
- 13:00-14:30 Discussion with Shoufie-san
- 14:30-15:00 Prepare daily report presentation
- 15:00-15:30 Meeting with Saito-sensei
- 15:30-18:00 Continuing work/updating Pukiwiki

*Goal of the project [#goal]

- To enhance the electric field at a graphene surface by optimizing the pattern of dielectric thin layers using deep learning algorithms.
- Keywords: graphene, transfer matrix, deep learning

*Questions and Answers [#QA]

This section is for posting questions from Haihao-san and answers from other group members.

- Please list questions here with brief reasons or details.
- For every question, use a double-asterisk (**) heading in the source so that it appears in the table of contents.
- For the answer, use a triple-asterisk (***) heading below the question so that it is aligned properly.
- List from new to old.

**Q: (Placeholder) [#Q1]
***A: (Placeholder) [#A1]

*Report [#report]

This part is written mainly by Haihao-san, but anyone else can add to it. Entries are ordered from new to old so that we do not need to scroll.

**July 11 [#july11]

- Found 10 conjugacy classes
- Determined the group is isomorphic to D8 x Z2 (see the sketch below)
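
A minimal Python sketch of one way to check this. The assumption (mine) is that the group is the one generated by cyclic shift, mirror reversal, and A<->B inversion acting on the 16 four-layer sequences, i.e. the operations found in the June 23-30 entries; closing these under composition and counting conjugacy classes by brute force gives 16 elements in 10 classes.

 from itertools import product
 # All 16 four-layer sequences; a symmetry operation is a permutation of them.
 states = list(product((0, 1), repeat=4))
 index = {s: i for i, s in enumerate(states)}
 def perm(f):                       # function on sequences -> permutation tuple
     return tuple(index[f(s)] for s in states)
 shift  = perm(lambda s: s[1:] + s[:1])              # cyclic shift
 mirror = perm(lambda s: s[::-1])                    # reversal
 invert = perm(lambda s: tuple(1 - b for b in s))    # A <-> B inversion
 def mul(p, q):                     # composition: apply q, then p
     return tuple(p[q[i]] for i in range(len(p)))
 def inv(p):
     out = [0] * len(p)
     for i, v in enumerate(p):
         out[v] = i
     return tuple(out)
 # Close the generators (plus identity) under multiplication to get the group
 G = {perm(lambda s: s), shift, mirror, invert}
 while True:
     new = {mul(g, h) for g in G for h in G} - G
     if not new:
         break
     G |= new
 # Conjugacy class of g is {h g h^-1 : h in G}; collect the distinct classes
 classes = {frozenset(mul(mul(h, g), inv(h)) for h in G) for g in G}
 print(len(G), "elements,", len(classes), "conjugacy classes")   # 16 elements, 10 classes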

**July 10 [#july10]

- Visited Matsushima, took boat cruise from Hon-Shiogama

**July 8 [#july8]

- Constructed multiplication table for 4 layers

To do:
- Construct character table for 4 layers

**July 7 [#july7]

- Found a formula for the number of patterns at each T value, for both even and odd numbers of layers
- Tried making a group of the symmetry operations on 2-layer lattices, but could not include all operations
- Created a multiplication table for the restricted set of operations
- Created a character table, but it does not match the observed degeneracy

To do:
- Find basis for each irreducible representation using the projection operator (standard formula below)
- Construct multiplication table for 4-layer symmetry operations, adding elements as necessary
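
For the projection-operator step above, the relevant formula is just the standard textbook one (nothing specific to this system): the projector onto irreducible representation a, with dimension d_a and character chi^(a), is

 P^{(a)} = \frac{d_a}{|G|} \sum_{g \in G} \left[ \chi^{(a)}(g) \right]^{*} D(g)

where D(g) is the representation matrix of g on the space spanned by the layer sequences; applying P^{(a)} to arbitrary basis vectors and keeping the linearly independent results gives a basis for that irreducible representation.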

**July 3-6 [#july36]

- Mid-program meeting in Kyoto

**July 1 [#july1]

- Unable to connect to the new drive once again...
- Booted successfully by connecting the drive via USB; temporary solution, need a new HDD cable
- Finally able to work on the mid-program meeting presentation
- Looked for a pattern in the number of sequences at each T value

**June 30 [#june30]

- Could not install the OS; the Mac wouldn't even recognize the presence of the new hard drive
- After opening up the Mac ~15 times, adjusting the connection, trying with the old HDD, and following guides and advice online (like using tape), FINALLY some combination of the various fixes worked
- Finally installed the OS, restored all my files from the Time Machine backup
- Slowly reinstalling apps as needed, since there is less space now

- Meanwhile, finally found the last two hidden symmetries, which explain all operations connecting different patterns with the same T!!
- They are arbitrary permutation of double layers, and pair inversion
- Can explain the N/2 + 1 possible T values for an even number of layers using a generalized concept of pairs: two of the same separated by an even number of layers

To do:
- Make presentation and practice
- Prove the total inverse, double permutation, and pair inversion symmetries (probably using the fact that P^2 = -Id)
- Find all products of operations, adding elements to make closed group
- Find mathematical pattern for number of sequences in each T for arbitrary N-layer
- Proofread Shoufie-san's paper

**June 29 [#june29]

- OS install failed; continued trying a multitude of things to repair the disk, but no matter what, it could not be repaired
- Concluded it must be a hard drive failure
- BUT Saito-sensei gave me a new one! Almost exactly the same, same brand/year, 320 GB vs. the old 500 GB; he helped me install it
- Discussed mid-program presentation

To do:
- Install OS on new drive and restore files

**June 28 [#june28]

- Computer crashed! Working on repairing computer
- Main partition seems corrupted, reinstalling OS
- Proved cyclic symmetry, as well as several smaller lemmas and corollaries

To do:
- FIX MAC ASAP

**June 27 [#june27]

- Worked on proving cyclic symmetry, found several useful lemmas e.g. regarding product of "chain" matching matrices
- Looking for last hidden symmetry/ies

**June 26 [#june26]

- Visited Sendai Mediatheque (public library/exhibition space)

**June 24 [#june24]

- Found cyclic symmetry
- Found a pattern in the number of distinct T values: N + 1 for odd N, N/2 + 1 for even N
- Calculated eigenvalues and eigenvectors of the TMs, will require further investigation

To do:
- Read first 3 chapters of graph theory textbook
- Prove cyclic symmetry

**June 23 [#june23]

- Proved mirror symmetry (numerical check below)
- Searched for patterns
- Even and odd numbers of layers behave differently
- Even has an inversion (A<->B) symmetry
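
A small numerical check (not a proof) of the mirror symmetry: for a lossless dielectric stack in vacuum, the standard characteristic-matrix method gives the same T for a sequence and its reverse at every frequency. The indices and the common phase thickness below are illustrative assumptions, not the actual project parameters.

 import numpy as np
 n_A, n_B = 2.0, 1.5                    # assumed refractive indices
 def transmission(seq, delta):
     """T of a binary stack in vacuum; delta = phase thickness of each layer."""
     M = np.eye(2, dtype=complex)
     for bit in seq:
         n = n_A if bit else n_B
         M = M @ np.array([[np.cos(delta), 1j*np.sin(delta)/n],
                           [1j*n*np.sin(delta), np.cos(delta)]])
     B, C = M @ np.array([1.0, 1.0])    # vacuum on both sides
     return abs(2.0/(B + C))**2
 seq = (1, 1, 0, 1)
 for delta in np.linspace(0.1, 3.0, 7): # sweep the phase thickness (i.e. frequency)
     print(round(transmission(seq, delta), 6),
           round(transmission(seq[::-1], delta), 6))   # the two columns agree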

**June 22 [#june22]

- Created frequency vs. discrete T plot, and T vs. Gray code plot for 4 layers (Gray code snippet below)
- Identified sequences with identical transmission spectra, found a symmetry (e.g. 1101 = 1011)
- Investigated the effect of flipping one bit on the E field for the 4-layer system; still looking at and trying to interpret the results (but it does not seem to be chaotic, at least)
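
For the Gray-code ordering (so that neighbouring sequences differ by exactly one flipped layer), the standard binary-reflected Gray code is enough; a minimal snippet for 4 layers:

 # Binary-reflected Gray code: consecutive entries differ in exactly one bit,
 # which orders the 4-layer sequences by single-layer flips.
 def gray(i, bits=4):
     return format(i ^ (i >> 1), '0{}b'.format(bits))
 print([gray(i) for i in range(16)])
 # ['0000', '0001', '0011', '0010', '0110', ...]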

To do:
- Prove using induction that mirrored sequences give identical transmission

**June 21 [#june21]

- Talked to a CS-major friend about the problem; he suggested hill climbing and genetic algorithms
- He doesn't think there is a need to use machine learning to predict transmission, because we already have a program that computes it exactly! ML is only for when you don't know how to program an algorithm to solve the problem
- Agreed that truly chaotic behavior would make any algorithm, including neural networks, fail (imagine predicting an RNG)
- Tried to see what the effect of changing one bit really is
- Found that transmission probability is discrete! (see the sketch after this list)
- Changing one bit can only move T one level up or down
- Investigated the effect of the number of layers and of changing the dielectric constants; the levels only depend on the ratio between the dielectric constants
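
A sketch of how the discrete levels can be reproduced numerically. The setup is an assumption on my part: every layer is a quarter-wave layer at the probe frequency (phase thickness pi/2) and the stack sits in vacuum, with illustrative indices. Under that assumption only a handful of T values appear for the 4-layer system, and they depend only on the index ratio.

 import numpy as np
 from itertools import product
 n_A, n_B = 2.0, 1.5                  # illustrative indices; only their ratio matters here
 def transmission(seq, delta=np.pi/2):
     """T of a binary stack in vacuum; every layer has phase thickness delta."""
     M = np.eye(2, dtype=complex)
     for bit in seq:
         n = n_A if bit else n_B
         M = M @ np.array([[np.cos(delta), 1j*np.sin(delta)/n],
                           [1j*n*np.sin(delta), np.cos(delta)]])
     B, C = M @ np.array([1.0, 1.0])
     return abs(2.0/(B + C))**2
 levels = sorted({round(transmission(s), 9) for s in product((0, 1), repeat=4)})
 print(levels)   # only a few discrete values (3 levels for these parameters)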

To do:
- Create frequency vs discrete T plot, and T vs Gray code plot for small number of layers

**June 20 [#june20]

- Found hole/security flaw in SquirrelMail webmail
- Wrote abstract for Zao meeting
- Researched GPUs for deep learning
- Deployed catnet in MATLAB, initial tests were a big success
- Figured out MNIST wasn't working because of a simple division/rescaling of the input; not sure if it is working yet
- Learned how to read/write files in MATLAB, played with different formats for saving data

To do:
- Create database of transmission spectra and E field plots
- Compile training data using random 100-layer sequences?
- Try to train with catnet (imagine)
- Talk to CS friend about how to implement, what type of neural network

**June 18 [#june18]

- Sendai castle adventure
- Performance group

**June 17 [#june17]

- Finally got MatCaffe working, but MNIST does not seem to be working
- Installed Caffe (/liu/caffe) on tube61 (Ubuntu 15.04) with ATLAS, CPU-only; ran catnet, still slow, maybe even slower than the Mac! (~6 min/20 iterations)
- Moved cat images to flex, continued training
- Installed OpenBLAS on tube61 to try to rebuild Caffe with OpenBLAS
- Installed Intel MKL (commercial, got a free student license) on tube60; installed Caffe (/liu/research/caffe) on tube60 (Ubuntu 12.04) with MKL, CPU-only, not yet tested with catnet

To do:
- Rebuild with OpenBLAS
- Finish training catnet
- Look into OpenMP for parallel processing support, OpenMP+MKL gives performance comparable to GPU

**June 16 [#june16]

- Downloaded 7000+ cat and dog images
- Training catnet is slow! 5 min/20 iter (1000 in total)
- Downloaded MNIST dataset, trained in about 10 mins
- Python wrapper installed properly, but the MATLAB one is not working, so can't deploy yet

To do:
- Test MNIST by deploying on either Matlab or Python
- Look into GPUs, Saito-sensei might buy one to install on lab server

**June 15 [#june15]

- Talked with Shoufie-san about basics of neural networks, showed him the power of genetic algorithms, discussed ideas on how to use networks to find/generate good sequences
- Continued troubleshooting constant errors; built Caffe, but make runtest was failing
- Found that the laptop's GPU is too old / not powerful enough; rebuilt CPU-only and finally passed all tests

To do:
- Learn how to use Caffe, run example programs MNIST and catnet
- Install MATLAB and Python wrappers/interface

**June 14 [#june14]

- Installed Homebrew, Miniconda (lightweight Anaconda Python distro)
- Installed CUDA and the other libraries that Caffe depends on
- Saito-sensei invited me to join Zao NanoCarbon Meeting, gladly accepted!

To do:
- Install Caffe

**June 13 [#june13]

- Found that the integral-based error function never gave a good score, max ~30 on a linear conversion scale for the 10-layer system
- Experimented with designing an evaluation function based on the Q factor for transmission, and on enhancement and position for the E field
- Watched some videos online to learn the basics of how deep learning is implemented, as well as to see more examples.

To do:
- File read/write to avoid recalculating every time
- Install Caffe

**June 10 [#june10]

- Adapted MATLAB programs to calculate the transfer matrix and plot transmission and enhancement for any arbitrary sequence
- Wrote error function
- Lab party!

To do:
- Evaluate scaling functions (0-100)

**June 9 [#june9]

- Wrote MATLAB program to plot the intensity of the E field in the Fibonacci lattice as a function of position (z), looked at the positions of enhancement (sketch below)
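
The original program was MATLAB; below is a rough, illustrative Python sketch of the same idea (quarter-wave layers, vacuum on both sides, assumed indices and an arbitrary example sequence), tracking the forward/backward amplitudes (E+, E-) through matching and propagation matrices and sampling |E(z)|^2 inside each layer.

 import numpy as np
 import matplotlib.pyplot as plt
 n_A, n_B = 2.0, 1.5                      # assumed refractive indices
 lam = 1.0                                # vacuum wavelength (arbitrary units)
 k0 = 2*np.pi/lam
 seq = (1, 0, 1, 1, 0, 1, 0, 1)           # example layer sequence (1 = A, 0 = B)
 n_of = lambda b: n_A if b else n_B
 d_of = lambda b: lam/(4*n_of(b))         # quarter-wave thickness (assumption)
 def D(n1, n2):                           # matching matrix, medium 1 -> medium 2
     r, t = (n1 - n2)/(n1 + n2), 2*n1/(n1 + n2)
     return np.array([[1, r], [r, 1]], dtype=complex)/t
 def P(phi):                              # propagation: left-edge <- right-edge amplitudes
     return np.array([[np.exp(-1j*phi), 0], [0, np.exp(1j*phi)]])
 # Total transfer matrix (vacuum on both sides) gives the transmitted amplitude t
 M = D(1.0, n_of(seq[0]))
 for j, b in enumerate(seq):
     n_next = n_of(seq[j+1]) if j + 1 < len(seq) else 1.0
     M = M @ P(k0*n_of(b)*d_of(b)) @ D(n_of(b), n_next)
 t = 1/M[0, 0]
 # Walk back from the output side, sampling E(z) inside each layer
 z_left = np.cumsum([0.0] + [d_of(b) for b in seq])   # left edge of each layer
 a, n_right = np.array([t, 0j]), 1.0      # amplitudes in the exit medium
 zs, I = [], []
 for j in range(len(seq) - 1, -1, -1):
     n, d = n_of(seq[j]), d_of(seq[j])
     a = D(n, n_right) @ a                # amplitudes at the right edge of layer j
     a = P(k0*n*d) @ a                    # amplitudes at the left edge of layer j
     for x in np.linspace(0, d, 30):
         E = a[0]*np.exp(1j*k0*n*x) + a[1]*np.exp(-1j*k0*n*x)
         zs.append(z_left[j] + x); I.append(abs(E)**2)
     n_right = n
 order = np.argsort(zs)
 plt.plot(np.array(zs)[order], np.array(I)[order])
 plt.xlabel("z"); plt.ylabel("|E|^2 / |E_inc|^2"); plt.show()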

To do:
- Adapt code to calculate transfer matrix and plot transmission and enhancement for any arbitrary sequence
- Design error function (0-100 scale) to measure how close a given transmission or enhancement spectrum is to a target spectrum
- Look into Caffe machine learning

**June 8 [#june8]

- Wrote MATLAB program to construct the transfer matrix for the n-th Fibonacci lattice (Python sketch below)
- Plotted transmission probability as a function of frequency; the 10th generation made a nice-looking fractal
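
The original program was MATLAB; a rough Python equivalent using the standard characteristic-matrix method is below. The refractive indices are illustrative, every layer is quarter-wave at the normalized frequency x = 1, and one common convention for the Fibonacci word is assumed.

 import numpy as np
 import matplotlib.pyplot as plt
 n_A, n_B = 2.0, 1.5                      # assumed refractive indices
 def fibonacci_word(gen):                 # S_1 = "A", S_2 = "AB", S_n = S_{n-1} + S_{n-2}
     a, b = "A", "AB"
     for _ in range(gen - 1):
         a, b = b, b + a
     return a
 def transmission(word, x):
     """T at normalized frequency x (x = 1 -> every layer is a quarter-wave layer)."""
     M = np.eye(2, dtype=complex)
     delta = x*np.pi/2                    # phase thickness scales with frequency
     for c in word:
         n = n_A if c == "A" else n_B
         M = M @ np.array([[np.cos(delta), 1j*np.sin(delta)/n],
                           [1j*n*np.sin(delta), np.cos(delta)]])
     B, C = M @ np.array([1.0, 1.0])      # vacuum on both sides
     return abs(2.0/(B + C))**2
 word = fibonacci_word(10)                # 10th-generation Fibonacci word
 xs = np.linspace(0.0, 2.0, 2000)
 plt.plot(xs, [transmission(word, x) for x in xs])
 plt.xlabel("normalized frequency"); plt.ylabel("T"); plt.show()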

To do:
- Clean up MATLAB code
- Plot E field in Fibonacci lattice as function of distance (z), see points of most enhancement

**June 7 [#june7]

- Walked to campus, took around 30 mins
- Found the explicit formula for the n-th Fibonacci number (noted below), and wrote a Python program to generate the n-th iteration of the Fibonacci fractal
- Had lunch with Hasdeo-san, bought lunch by weight from the cafeteria (entrees 1.4 yen/g, rice 0.43 yen/g)
- Derived the matching (boundary) and propagation matrices in the transfer matrix method, used them to calculate R and T probabilities for comparison
- Derived E and H relations from Maxwell's equations (Ampere's)
- Introduced myself at group meeting, shared some omiyage from China
- Set up printer over LAN, had to install driver manually
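
The explicit (Binet) formula referred to above is

 F_n = \frac{\varphi^n - \psi^n}{\sqrt{5}}, \qquad \varphi = \frac{1+\sqrt{5}}{2}, \qquad \psi = \frac{1-\sqrt{5}}{2}

and the Fibonacci lattice itself follows the substitution A -> AB, B -> A (equivalently S_n = S_{n-1} S_{n-2}, one common convention); a Python version of that construction is sketched under the June 8 entry above.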

To do:
- Check that theory agrees with Snell's Law
- Construct transfer matrix for n-th Fibonacci lattice

**June 6 [#june6]

- Shoufie-san picked me up from Urban Castle Kawauchi, took me to campus by subway (International Center -> Aobayama, 200 yen ~20 mins total)
- Nugraha-sensei helped me set up lab server access, mail client, etc.
- Bento lunch from Espace Ouvert restaurant (next to 7-11 on bottom floor of new building), ate with Saito-sensei who played his ukulele
- Learned how to derive the boundary conditions for an EM wave passing from one dielectric medium to another, using Maxwell's equations (Faraday's and Ampere's)
- Solved for the reflection and transmission probabilities at the boundary for a normally incident wave (formulas below)
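
For reference, the standard normal-incidence amplitude coefficients and probabilities at a single boundary between media of refractive indices n_1 and n_2 are

 r = \frac{n_1 - n_2}{n_1 + n_2}, \qquad t = \frac{2 n_1}{n_1 + n_2}, \qquad R = r^2, \qquad T = \frac{n_2}{n_1}\, t^2 = 1 - R

(at normal incidence the TE/TM distinction disappears).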

To do:
- Test SSH from dorm room
- Read pages on reflection and transmission of EM waves at a boundary, and derive the probabilities for both TE and TM polarizations of obliquely incident waves
