Trainer

This example shows how to use the Trainer class to train a moment tensor potential (MTP).

import logging

import ase.io
from ase.visualize.plot import plot_atoms

from motep.io.mlip.mtp import read_mtp, write_mtp
from motep.train import Trainer

If you want log output, configure logging as follows.

logging.basicConfig(level=logging.INFO, format="%(message)s")

We first load the atomic configurations used for training.

images = ase.io.read("../0.0_train/ase.xyz", index=":")
ax = plot_atoms(images[0])

We next load the initial (likely untrained) potential and assign the chemical species as atomic numbers (here 6 = C and 1 = H).

mtp_data = read_mtp("initial.mtp")
mtp_data.species = [6, 1]
mtp_data
MTPData(version='1.1.0', potential_name='MTP1m', scaling=1.0, species_count=2, potential_tag='', radial_basis_type='RBChebyshev', min_dist=np.float64(0.5), max_dist=np.float64(5.0), radial_funcs_count=np.int32(1), radial_basis_size=np.int32(8), radial_coeffs=None, alpha_moments_count=np.int32(1), alpha_index_basic_count=np.int32(1), alpha_index_basic=array([[0, 0, 0, 0]], dtype=int32), alpha_index_times_count=np.int32(0), alpha_index_times=array([], shape=(0, 4), dtype=int32), alpha_scalar_moments=np.int32(1), alpha_moment_mapping=array([0], dtype=int32), species_coeffs=None, moment_coeffs=None, _species=array([6, 1], dtype=int32), optimized=['species_coeffs', 'moment_coeffs', 'radial_coeffs'])
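
The repr above shows a Chebyshev radial basis (RBChebyshev) of size 8 acting between min_dist = 0.5 and max_dist = 5.0 Å. The following is a minimal sketch of evaluating such a basis, assuming the common convention of mapping the distance linearly onto [-1, 1]; the exact scaling and cutoff convention used by MTP may differ.

```python
def chebyshev_basis(r, size=8, r_min=0.5, r_max=5.0):
    """Evaluate Chebyshev polynomials T_0..T_{size-1} at distance r,
    with r mapped linearly from [r_min, r_max] onto [-1, 1]."""
    xi = (2.0 * r - (r_min + r_max)) / (r_max - r_min)
    values = [1.0, xi]  # T_0 = 1, T_1 = xi
    for _ in range(2, size):
        # Three-term recurrence: T_n = 2*xi*T_{n-1} - T_{n-2}
        values.append(2.0 * xi * values[-1] - values[-2])
    return values[:size]

# At r = r_max the mapped coordinate is xi = 1, where every T_n equals 1.
print(chebyshev_basis(5.0))
```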

We then create a Trainer instance and train the potential on the images.

trainer = Trainer(mtp_data, seed=42)
loss = trainer.train(images)
[random seed] = 42
========================================================================

{'method': 'minimize'}

loss 0: 1011.5020145280739
loss 1: 23.96611618212598
loss 2: 16.080950300447974
loss 3: 6.311941371024075
loss 4: 3.1615359066847706
loss 5: 0.3927903836126466
loss 6: 0.12596640778521542
loss 7: 0.11544503492176875
loss 8: 0.11307700359591366
loss 9: 0.10964666248733693
loss 10: 0.09822874264626325
loss 11: 0.07973445127246574
loss 12: 0.05727215676231659
loss 13: 0.05666844956831864
loss 14: 0.04756055447339477
loss 15: 0.04595252718896887
loss 16: 0.04546252660302083
loss 17: 0.044461306114880784
loss 18: 0.04148174758175779
loss 19: 0.03539303550156289
loss 20: 0.025595599217162604
loss 21: 0.024678603917023006
loss 22: 0.017362626506187123
loss 23: 0.014945724946538525
loss 24: 0.01468375758702544
loss 25: 0.014673688242868885
loss 26: 0.01464803637032551
loss 27: 0.014576868624252708
loss 28: 0.013887005813564671
loss 29: 0.012861413722348993
loss 30: 0.010702510270039496
loss 31: 0.007803168669166078
loss 32: 0.007441204000235861
loss 33: 0.006749507598030517
loss 34: 0.005139895349442437
loss 35: 0.004919058451493408
loss 36: 0.004834520619271784
loss 37: 0.004832245750040917
loss 38: 0.004829629732790496
loss 39: 0.0048252897555635405
loss 40: 0.004813593974951447
loss 41: 0.004799566962374252
loss 42: 0.004711896388232165
loss 43: 0.004547632339802547
loss 44: 0.0041248465645066824
loss 45: 0.0034454664434983686
loss 46: 0.003035505387868393
loss 47: 0.0022631855121557267
loss 48: 0.0021319882000711693
loss 49: 0.0017968069175545436
loss 50: 0.0017746052125098095
loss 51: 0.0017563117677812264
loss 52: 0.0017454259077508431
loss 53: 0.0017437819048056376
loss 54: 0.0017425882527997813
loss 55: 0.0017388557964254146
loss 56: 0.001730723965053897
loss 57: 0.0017127256809486996
loss 58: 0.0017030754088078039
loss 59: 0.0016801355094290009
loss 60: 0.0016681681204539821
loss 61: 0.0016521008270236736
loss 62: 0.0016467901761826823
loss 63: 0.0016399961641262377
loss 64: 0.00163404462457076
loss 65: 0.0016295722622563838
loss 66: 0.0016153413803295306
loss 67: 0.0015953267606546456
loss 68: 0.0015785239840638372
loss 69: 0.0015405537747389298
loss 70: 0.0015073460786361199
loss 71: 0.0014758996645563413
loss 72: 0.001457407729000278
loss 73: 0.0014445428190844982
loss 74: 0.00143447374324636
loss 75: 0.0014277500276575052
loss 76: 0.0014195423853907564
loss 77: 0.0014151043281703172
loss 78: 0.0014119788131766718
loss 79: 0.001408757969024091
loss 80: 0.0014036780546491808
loss 81: 0.001394144484060693
loss 82: 0.0013889141958905863
loss 83: 0.0013756497100150533
loss 84: 0.001362426244072268
loss 85: 0.0013457665266839638
loss 86: 0.0013293138012343835
loss 87: 0.0013165302463423111
loss 88: 0.0012756653545766663
loss 89: 0.00126196316569394
loss 90: 0.0012600962194512777
loss 91: 0.00125791220555602
loss 92: 0.0012572848197164976
loss 93: 0.0012564275197188405
loss 94: 0.0012560359667935404
loss 95: 0.0012551293211101016
loss 96: 0.0012545825064952572
loss 97: 0.0012542336634080325
loss 98: 0.0012539534671664894
loss 99: 0.0012534628808149552
loss 100: 0.0012531022190826434
loss 101: 0.0012527220004856648
loss 102: 0.0012523074204189703
loss 103: 0.0012512518271273524
loss 104: 0.001250360451724703
loss 105: 0.0012498421823189578
loss 106: 0.0012483635245943594
loss 107: 0.0012476048362638178
loss 108: 0.0012455136214340695
loss 109: 0.0012386568600051013
loss 110: 0.0012343403675948222
loss 111: 0.001221055942356396
loss 112: 0.001199885900159287
loss 113: 0.0011799302687385024
loss 114: 0.001172259521731017
loss 115: 0.0011693523619329578
loss 116: 0.0011680635275380805
loss 117: 0.0011677092694508963
loss 118: 0.0011669657381787213
loss 119: 0.0011665515116336513
loss 120: 0.0011642933834376753
loss 121: 0.0011614878330391632
loss 122: 0.001156407615635544
loss 123: 0.001154556685045722
loss 124: 0.0011532598093869934
loss 125: 0.0011509861191007313
loss 126: 0.001150592841988925
loss 127: 0.0011505154542606093
loss 128: 0.0011503649809775612
loss 129: 0.0011502400191887262
loss 130: 0.0011499441142949378
loss 131: 0.0011496159663270528
loss 132: 0.0011493617990995224
loss 133: 0.001148513361191225
loss 134: 0.0011479985208449044
loss 135: 0.0011458483614311705
loss 136: 0.001145483184138674
loss 137: 0.0011452733838311585
loss 138: 0.0011450840448929682
loss 139: 0.0011448492522203332
loss 140: 0.0011447004414282664
loss 141: 0.001144461956700868
loss 142: 0.0011437661340644209
loss 143: 0.0011434271149756135
loss 144: 0.0011429995694333967
loss 145: 0.0011419766635173763
loss 146: 0.0011409078614182991
loss 147: 0.0011407646503027037
loss 148: 0.001138448348256507
loss 149: 0.0011363514721875522
loss 150: 0.0011326181644401446
loss 151: 0.0011296800532558117
loss 152: 0.001127060907758005
loss 153: 0.001119924727302231
loss 154: 0.0011087092965063347
loss 155: 0.0010955246856034664
loss 156: 0.0010864416122527841
loss 157: 0.0010811551027078902
loss 158: 0.0010792131315267403
loss 159: 0.001072665208363245
loss 160: 0.001067270705921997
loss 161: 0.0010611669665820576
loss 162: 0.0010539955120043639
loss 163: 0.0010501540082295863
loss 164: 0.001049837207824968
loss 165: 0.0010477474528098243
loss 166: 0.0010463191802332365
loss 167: 0.0010398683459868989
loss 168: 0.0010326350208215412
loss 169: 0.0010149265225792162
loss 170: 0.0009964870356863231
loss 171: 0.0009744578482370498
loss 172: 0.0009696087825559244
loss 173: 0.0009548325008856472
loss 174: 0.000950612721975158
loss 175: 0.0009472335422792097
loss 176: 0.0009447850500780092
loss 177: 0.00094339561443998
loss 178: 0.0009422380485584323
loss 179: 0.0009412802722040563
loss 180: 0.0009317462155928215
loss 181: 0.0009219149112446026
loss 182: 0.0009182898086082976
loss 183: 0.0009163635109174102
loss 184: 0.0009157859034844845
loss 185: 0.0009155391541576201
loss 186: 0.0009143816159129316
loss 187: 0.0009116270922163359
loss 188: 0.0009108820724559396
loss 189: 0.0009088711600071265
loss 190: 0.0009070383661078283
loss 191: 0.0009062677812595162
loss 192: 0.0009061549674272306
loss 193: 0.0009058573974838889
loss 194: 0.0009058368324650854
loss 195: 0.0009058286453503459
loss 196: 0.0009058193742616533
loss 197: 0.0009057366344086813
loss 198: 0.000905597237181046
loss 199: 0.0009053148517424297
loss 200: 0.0009052704889126342
loss 201: 0.0009047840242193954
loss 202: 0.0009043413891123189
loss 203: 0.0009031034507928523
loss 204: 0.000901651640850023
loss 205: 0.0009013582262280417
loss 206: 0.0009006870595514077
loss 207: 0.0008990894547448717
loss 208: 0.0008948346146843832
loss 209: 0.0008911615505575575
loss 210: 0.0008876659983902093
loss 211: 0.0008851009903798081
loss 212: 0.0008791937578730217
loss 213: 0.0008417597069616386
loss 214: 0.0008133683462425185
loss 215: 0.0007811950716624561
loss 216: 0.0007781490239017015
loss 217: 0.0007720024178282057
loss 218: 0.000771143273172068
loss 219: 0.0007706407183108783
loss 220: 0.0007700974617974364
loss 221: 0.0007687861746840662
loss 222: 0.0007672742430445804
loss 223: 0.000766118790809986
loss 224: 0.0007659058811582857
loss 225: 0.0007651117437803772
loss 226: 0.0007642893872631118
loss 227: 0.0007633992387788206
loss 228: 0.0007624981250166751
loss 229: 0.0007607192601481327
loss 230: 0.0007584794621707376
loss 231: 0.0007539269944568303
loss 232: 0.0007512126432447726
loss 233: 0.000746053584706713
loss 234: 0.0007426467200761671
loss 235: 0.0007418704204017878
loss 236: 0.0007391654040023784
loss 237: 0.0007358107613069078
loss 238: 0.0007330327724355799
loss 239: 0.0007221535986723176
loss 240: 0.0007174702620869899
loss 241: 0.0007040509190348944
loss 242: 0.0006700150490461724
loss 243: 0.0006308738256813288
loss 244: 0.0005949638190601592
loss 245: 0.0005844738831000169
loss 246: 0.0005821523873104343
loss 247: 0.0005731189812132264
loss 248: 0.0005693207989674351
loss 249: 0.0005684502158810269
loss 250: 0.0005679113545187855
loss 251: 0.0005664092511140131
loss 252: 0.0005656561093062393
loss 253: 0.0005642813595324458
loss 254: 0.0005629056569284672
loss 255: 0.0005604238372635319
loss 256: 0.0005590442565443682
loss 257: 0.0005586284171078384
loss 258: 0.0005578014974343333
loss 259: 0.0005567426752726936
loss 260: 0.0005544870186096714
loss 261: 0.0005521260309366934
loss 262: 0.0005492113843466574
loss 263: 0.0005456669851111253
loss 264: 0.0005413548244392902
loss 265: 0.0005355791866403625
loss 266: 0.0005335829796360046
loss 267: 0.0005306276311165146
loss 268: 0.0005297853131164526
loss 269: 0.0005296580427195402
loss 270: 0.0005292294251397462
loss 271: 0.0005286961231866015
loss 272: 0.0005277647391984277
loss 273: 0.0005270466485345213
loss 274: 0.0005266847943755771
loss 275: 0.0005261881442390596
loss 276: 0.0005256034641188363
loss 277: 0.000523483223648686
loss 278: 0.0005214062984189513
loss 279: 0.0005185346481607396
loss 280: 0.0005161673772056849
loss 281: 0.0005148663704588425
loss 282: 0.0005128790284233363
loss 283: 0.0005112650408625921
loss 284: 0.0005079692834586818
loss 285: 0.0005026841709246698
loss 286: 0.0004973254711395646
loss 287: 0.0004913816217679016
loss 288: 0.0004879239938165274
loss 289: 0.00048601732345720844
loss 290: 0.0004855642868932147
loss 291: 0.0004847836581792462
loss 292: 0.0004846477215241099
loss 293: 0.00048434490119979404
loss 294: 0.0004841333856016128
loss 295: 0.0004839016429672282
loss 296: 0.0004836869201340028
loss 297: 0.000483395127069036
loss 298: 0.000483364649513102
loss 299: 0.00048334225378433436
loss 300: 0.00048331963259775284
loss 301: 0.0004832967323218868
loss 302: 0.00048324096074176903
loss 303: 0.00048311233088483305
loss 304: 0.0004829399841837637
loss 305: 0.0004820717824849233
loss 306: 0.0004813074689446983
loss 307: 0.0004803609356881207
loss 308: 0.0004796621023615301
loss 309: 0.0004792076653625697
loss 310: 0.00047881641347923795
loss 311: 0.000478450112903969
loss 312: 0.00047789773909124206
loss 313: 0.0004777484685644238
loss 314: 0.00047743087361619587
loss 315: 0.000477360497520474
loss 316: 0.00047725740471596384
loss 317: 0.0004770969755224854
loss 318: 0.00047681548716242906
loss 319: 0.0004764443250127081
loss 320: 0.00047639928784891367
loss 321: 0.0004761282542865912
loss 322: 0.0004760388435486997
loss 323: 0.0004757749515342071
loss 324: 0.00047572037780970703
loss 325: 0.0004757019302157532
loss 326: 0.0004756817968409591
loss 327: 0.0004756776513567508
loss 328: 0.00047567488224567635
loss 329: 0.00047567070923294747
loss 330: 0.00047566560837208167
loss 331: 0.0004756609533238624
loss 332: 0.0004756477835372259
loss 333: 0.0004756245898312771
loss 334: 0.00047560623381918927
loss 335: 0.0004755436862943388
loss 336: 0.00047536037727025055
loss 337: 0.00047503121358414223
loss 338: 0.00047469535531872845
loss 339: 0.00047402218843175484
loss 340: 0.00047345192574574303
loss 341: 0.0004733602424070742
loss 342: 0.00047333266994813524
loss 343: 0.00047327350452914987
loss 344: 0.0004732006605899786
loss 345: 0.00047319350248193216
loss 346: 0.0004731542486199745
loss 347: 0.00047310535752564145
loss 348: 0.00047309280114168537
loss 349: 0.0004730359346951518
loss 350: 0.00047299593455276306
loss 351: 0.0004729617666501113
loss 352: 0.00047286625130351804
loss 353: 0.0004727994956997514
loss 354: 0.00047255393945875104
loss 355: 0.00047211053000089054
loss 356: 0.00047134428235035155
loss 357: 0.00047042657884532944
loss 358: 0.0004701542872550036
loss 359: 0.0004689220591792784
loss 360: 0.00046851714207402344
loss 361: 0.00046845656218023093
loss 362: 0.00046839882750068757
loss 363: 0.0004683830379887747
loss 364: 0.00046837569376749206
loss 365: 0.00046837234807392184
loss 366: 0.00046836371225484485
loss 367: 0.0004683604150810808
loss 368: 0.00046835317451884937
loss 369: 0.0004683367522931582
loss 370: 0.00046830841121406374
loss 371: 0.0004682520640162707
loss 372: 0.00046822411876337803
loss 373: 0.0004681606945602687
loss 374: 0.0004681233765467535
loss 375: 0.00046811076126582684
loss 376: 0.0004680922536044756
loss 377: 0.00046806192087557965
loss 378: 0.00046799020374741786
loss 379: 0.00046786333318358077
loss 380: 0.00046777902987127357
loss 381: 0.00046762523134095263
loss 382: 0.00046742794034640103
loss 383: 0.0004673817329130546
loss 384: 0.0004673424657229836
loss 385: 0.0004672310672533911
loss 386: 0.0004672085373841778
loss 387: 0.0004671936749889559
loss 388: 0.0004671850574222287
loss 389: 0.0004671615955909438
loss 390: 0.0004671290515545452
loss 391: 0.0004670842024409672
loss 392: 0.0004670680032683845
loss 393: 0.00046701107093670397
loss 394: 0.0004669405243806843
loss 395: 0.0004668803835447808
loss 396: 0.000466783828379801
loss 397: 0.00046676804411298795
loss 398: 0.0004666513251051497
loss 399: 0.00046652704626219855
loss 400: 0.00046647766038303913
loss 401: 0.0004664560842599957
loss 402: 0.0004664222858037637
loss 403: 0.00046641054643913206
loss 404: 0.0004663802310615034
loss 405: 0.00046631822175477206
loss 406: 0.00046629660826313224
loss 407: 0.000466223176457652
loss 408: 0.00046619662301589836
loss 409: 0.000466093420694223
loss 410: 0.0004660151834658055
loss 411: 0.0004658866335222421
loss 412: 0.00046576932834290066
loss 413: 0.00046570185954225406
loss 414: 0.0004655907842832865
loss 415: 0.0004654873313294114
loss 416: 0.0004654412289414573
loss 417: 0.00046542376487170364
loss 418: 0.00046542153411889954
loss 419: 0.0004654206130109406

Optimization result:
  Message: CONVERGENCE: RELATIVE REDUCTION OF F <= FACTR*EPSMCH
  Success: True
  Status code: 0
  Number of function evaluations: 473
  Number of iterations: 419
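
The convergence message above is the standard stopping criterion of L-BFGS-B in SciPy: the relative reduction of the objective fell below factr * epsmch. Whether Trainer actually delegates to scipy.optimize.minimize is an assumption here (the {'method': 'minimize'} line suggests a SciPy-style minimizer), but the check itself can be reproduced from the last two loss values in the trace:

```python
import sys

# Last two loss values from the trace above.
f_prev = 0.00046542153411889954  # loss 418
f_curr = 0.0004654206130109406   # loss 419

factr = 1e7                      # SciPy's default factr for L-BFGS-B
epsmch = sys.float_info.epsilon  # machine epsilon, ~2.22e-16

# L-BFGS-B stops when (f_prev - f_curr) / max(|f_prev|, |f_curr|, 1) <= factr * epsmch.
reduction = (f_prev - f_curr) / max(abs(f_prev), abs(f_curr), 1.0)
print(reduction <= factr * epsmch)  # True: the criterion is satisfied
```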

Energy (eV):
    Errors checked for 101 configurations
    MAX error: 0.054722519962922433
    ABS error: 0.010380311289190506
    RMS error: 0.013925183935552312

Energy per atom (eV/atom):
    Errors checked for 101 configurations
    MAX error: 0.006840314995365304
    ABS error: 0.0012975389111488133
    RMS error: 0.001740647991944039

Forces per component (eV/angstrom):
    Errors checked for 808 atoms
    MAX error: 0.6297023811568421
    ABS error: 0.09364386499870943
    RMS error: 0.12414920560614175

Stress per component (GPa):
    Errors checked for 0 configurations
    MAX error: nan
    ABS error: nan
    RMS error: nan

Time (step 0: minimize): 21.83408881599999 (s)
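
The MAX, ABS, and RMS errors reported above are the maximum absolute, mean absolute, and root-mean-square deviations between predicted and reference values (the stress errors are nan because no reference stresses were checked). A small sketch with hypothetical residuals shows the three definitions; note also that each configuration here has 808 / 101 = 8 atoms, so the per-atom energy errors are exactly the per-configuration errors divided by 8.

```python
import math

# Hypothetical residuals (predicted - reference), for illustration only.
residuals = [0.3, -0.1, 0.2, -0.4]

max_err = max(abs(r) for r in residuals)                    # MAX error
abs_err = sum(abs(r) for r in residuals) / len(residuals)   # ABS (mean absolute) error
rms_err = math.sqrt(sum(r * r for r in residuals) / len(residuals))  # RMS error
print(max_err, abs_err, rms_err)

# Per-atom energy errors in the report are the per-configuration
# errors divided by the 8 atoms of each configuration:
print(0.010380311289190506 / 8)  # matches the per-atom ABS error above
```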

We can evaluate the returned loss function at the trained parameters to see its final value, which matches the last entry of the trace above.

loss(mtp_data.parameters)
np.float64(0.0004654206130109406)

After training, the given mtp_data is updated in place. We finally store the trained potential in a new file.

write_mtp("final.mtp", mtp_data)

Total running time of the script: (0 minutes 22.366 seconds)

Gallery generated by Sphinx-Gallery