AMP-USERS Archives


AMP-USERS@LISTSERV.BROWN.EDU



Subject: AMP Annealer and Basinhopping
From: Anthony <[log in to unmask]>
Reply-To: Amp Users List <[log in to unmask]>
Date: Tue, 6 Mar 2018 10:57:22 -0500


As I understand it, the Annealer within AMP operates in this sequence:

1. Generate a random initial parameter vector X0
2. Calculate the value of the loss function with X0
3. Generate a trial move resulting in a new parameter vector X1
4. Calculate the value of the loss function with X1
5. Decide whether to accept or reject the new vector X1 based on the Metropolis criterion (sketched below)
6. If rejected, return to step 3; if accepted, set X0 = X1 and return to step 3

So, this describes a Monte Carlo based search in the model parameter space.
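In pseudocode, I take the acceptance test in step 5 to be the standard Metropolis criterion; the function below is my own illustration, not AMP code:

=================================================
import numpy as np

def metropolis_accept(loss_old, loss_new, T):
    # Downhill moves are always accepted; uphill moves are
    # accepted with probability exp(-(loss_new - loss_old) / T).
    if loss_new <= loss_old:
        return True
    return np.random.rand() < np.exp(-(loss_new - loss_old) / T)
=================================================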

Rather than perform the Metropolis test on the loss function value of the randomly generated X1, I would like to perform this test on an optimized set of parameters X1' obtained by minimizing the loss function, starting from X1, with, say, a BFGS minimization.

I believe that this can be accomplished with the "basinhopping" algorithm using the Regressor class, as indicated in the "Adjusting convergence parameters" section of the AMP documentation.
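For reference, calling scipy's basinhopping directly on a toy one-dimensional loss shows the intended behavior (local BFGS minimization of each trial move, then the Metropolis test on the minimized value); the loss function here is just a stand-in, not the AMP loss:

=================================================
import numpy as np
from scipy.optimize import basinhopping

def loss(x):
    # toy stand-in for the AMP loss function
    return np.cos(14.5 * x[0] - 0.3) + (x[0] + 0.2) * x[0]

x0 = np.array([1.0])
# Each trial move is first locally minimized (BFGS here); the
# Metropolis test is then applied to the minimized loss value.
result = basinhopping(loss, x0,
                      minimizer_kwargs={'method': 'BFGS'},
                      niter=100, T=1.0, stepsize=0.5)
print(result.x, result.fun)
=================================================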

Unfortunately, errors are generated when using this option:

train.py
=================================================
from amp import Amp
from amp.model import LossFunction
from amp.model.neuralnetwork import NeuralNetwork
from amp.descriptor.zernike import Zernike
from amp.regression import Regressor
from scipy.optimize import basinhopping

num_cores = 10
train_file = 'training.traj'

# converge on energy only
convergence = {'energy_rmse': 0.0005, 'force_rmse': None}

calc = Amp(descriptor=Zernike(cutoff=10),
           model=NeuralNetwork(hiddenlayers=(8, 8, 8)),
           cores=num_cores)

# hand scipy's basinhopping directly to the Regressor
regressor = Regressor(optimizer=basinhopping)
calc.model.regressor = regressor

calc.model.lossfunction = LossFunction(convergence=convergence)

calc.train(images=train_file)
====================================================

produces the following traceback:

Traceback (most recent call last):
  File "training_bhop.py", line 42, in <module>
    calc.train(images=train_file)
  File "/Apps/software/anaconda2/lib/python2.7/site-packages/amp-dev-py2.7.egg/amp/__init__.py", line 311, in train
    parallel=self._parallel)
  File "/Apps/software/anaconda2/lib/python2.7/site-packages/amp-dev-py2.7.egg/amp/model/neuralnetwork.py", line 228, in fit
    result = self.regressor.regress(model=self, log=log)
  File "/Apps/software/anaconda2/lib/python2.7/site-packages/amp-dev-py2.7.egg/amp/regression/__init__.py", line 85, in regress
    **self.optimizer_kwargs)
  File "/Apps/software/anaconda2/lib/python2.7/site-packages/scipy/optimize/_basinhopping.py", line 632, in basinhopping
    niter_success = niter + 2
TypeError: unsupported operand type(s) for +: 'instancemethod' and 'int'
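Looking at the scipy source, line 632 is `niter_success = niter + 2`, so basinhopping's third positional parameter, niter, appears to be receiving a bound method; my guess is that the Regressor passes the loss gradient as a third positional argument, as the default scipy minimizers expect. If that is the case, a thin wrapper that catches the gradient and forwards it through minimizer_kwargs might sidestep the mismatch. This is an untested sketch; the name basinhopping_optimizer and the assumed call signature optimizer(loss, x0, lossprime) are mine:

=================================================
from scipy.optimize import basinhopping
from amp.regression import Regressor

def basinhopping_optimizer(function, x0, jac=None, **kwargs):
    # Catch the gradient positionally so it cannot collide with
    # basinhopping's niter, and forward it to the local minimizer.
    minimizer_kwargs = {'method': 'BFGS', 'jac': jac}
    return basinhopping(function, x0,
                        minimizer_kwargs=minimizer_kwargs,
                        **kwargs)

regressor = Regressor(optimizer=basinhopping_optimizer)
=================================================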


I don't believe there is anything wrong in the training script itself. Can someone offer some insight into the problem?
Anthony
