====== PAO-ML ======

  
PAO-ML stands for Polarized Atomic Orbitals from Machine Learning. It uses machine learning to generate geometry-adapted small basis sets, and it also provides exact ionic forces. The scheme can serve as an almost drop-in replacement for conventional basis sets to speed up otherwise standard DFT calculations. The method is similar to semi-empirical models based on minimal basis sets, but offers improved accuracy and quasi-automatic parameterization. However, the method is still in an early stage, so use it with caution. For more information see: [[doi>10.1021/acs.jctc.8b00378]].
  
===== Step 1: Obtain training structures =====
In order to obtain good results from the learning machinery, a small number of so-called [[https://en.wikipedia.org/wiki/Hyperparameter|hyperparameters]] have to be carefully tuned for each application. For the current implementation this includes [[inp>FORCE_EVAL/DFT/LS_SCF/PAO/MACHINE_LEARNING#GP_SCALE|GP_SCALE]] and the descriptor's [[inp>FORCE_EVAL/SUBSYS/KIND/PAO_DESCRIPTOR#BETA|BETA]] and [[inp>FORCE_EVAL/SUBSYS/KIND/PAO_DESCRIPTOR#SCREENING|SCREENING]].
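As a sketch of where these keywords live in the input file (the numeric values and the ''O'' kind are placeholders, not recommendations — they must be tuned per application as described above):

```
&FORCE_EVAL
  &DFT
    &LS_SCF
      &PAO
        &MACHINE_LEARNING
          GP_SCALE 0.05        ! placeholder value, tune per application
        &END MACHINE_LEARNING
      &END PAO
    &END LS_SCF
  &END DFT
  &SUBSYS
    &KIND O                    ! hypothetical kind, repeat per element
      &PAO_DESCRIPTOR
        BETA      2.0          ! placeholder value
        SCREENING 0.5          ! placeholder value
      &END PAO_DESCRIPTOR
    &END KIND
  &END SUBSYS
&END FORCE_EVAL
```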
  
No gradient is available for the hyperparameter optimization, hence one has to use a derivative-free method such as [[https://en.wikipedia.org/wiki/Powell%27s_method|Powell's method]]. A versatile implementation is e.g. the [[src>tools/scriptmini|scriptmini]] tool. A good optimization criterion is the variance of the energy difference with respect to the primary basis across the training set. Alternatively, atomic forces could be compared. Despite the missing gradients, this optimization is rather quick because it only performs calculations in the small PAO basis set.
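The tuning loop can be mocked up as follows. This is an illustrative sketch, not the scriptmini tool itself: the hypothetical ''energy_differences'' function stands in for actual CP2K runs over the training set (replaced here by a synthetic surrogate so the script runs standalone), and the minimizer is a simple axis-wise search, a simplification of Powell-style derivative-free optimization:

```python
import statistics

def energy_differences(hparams):
    """Hypothetical stand-in for running CP2K on each training structure
    and collecting E(PAO basis) - E(primary basis). Here a synthetic
    surrogate whose error spread vanishes near (0.1, 2.0)."""
    gp_scale, beta = hparams
    r = (gp_scale - 0.1) ** 2 + (beta - 2.0) ** 2
    return [(i + 1) * r for i in range(5)]

def objective(hparams):
    # Optimization criterion: variance of the energy error across the set.
    return statistics.pvariance(energy_differences(hparams))

def coordinate_search(x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=200):
    """Derivative-free minimizer: probe +/- step along each axis,
    keep improvements, halve the step when no axis improves."""
    x = list(x0)
    f = objective(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                ft = objective(trial)
                if ft < f:
                    x, f = trial, ft
                    improved = True
        if not improved:
            step *= shrink
            if step < tol:
                break
    return x, f

best, var = coordinate_search([1.0, 1.0])
print(best, var)  # converges close to (0.1, 2.0) for this mock surrogate
```

In a real setup each objective evaluation would launch the small-basis PAO calculations over the whole training set, which is why the absence of gradients is affordable.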
  
===== Step 5: Run simulation with PAO-ML =====
howto/pao-ml.1531487059.txt.gz ยท Last modified: 2018/07/13 15:04 by oschuett