# vit-base-patch16-224-in21k-bridgedefectVIT
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.1799
- Accuracy: 0.9706
- F1: 0.9705
- Precision: 0.9711
- Recall: 0.9704
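The card ships without a usage snippet; a minimal inference sketch using the `transformers` pipeline API is below. The gray placeholder image stands in for a real bridge photograph, and the returned label names depend on the imagefolder class directories used during training (not documented here).

```python
# Minimal inference sketch: classify an image with the fine-tuned checkpoint.
# The solid-gray image is a placeholder; substitute a real bridge photo.
from PIL import Image
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="mmomm25/vit-base-patch16-224-in21k-bridgedefectVIT",
)

image = Image.new("RGB", (224, 224), color="gray")  # placeholder input
predictions = classifier(image)
print(predictions)  # list of {'label': ..., 'score': ...} dicts, best first
```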
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.37 | 1.0 | 8302 | 0.3462 | 0.8933 | 0.8942 | 0.8984 | 0.8931 |
| 0.2375 | 2.0 | 16605 | 0.3353 | 0.9053 | 0.9062 | 0.9127 | 0.9053 |
| 0.5678 | 3.0 | 24907 | 0.3114 | 0.9119 | 0.9117 | 0.9166 | 0.9116 |
| 0.09 | 4.0 | 33210 | 0.2768 | 0.9270 | 0.9272 | 0.9305 | 0.9268 |
| 0.266 | 5.0 | 41512 | 0.2595 | 0.9313 | 0.9313 | 0.9327 | 0.9310 |
| 0.2037 | 6.0 | 49815 | 0.2123 | 0.9431 | 0.9429 | 0.9436 | 0.9429 |
| 0.1487 | 7.0 | 58117 | 0.2282 | 0.9430 | 0.9430 | 0.9445 | 0.9429 |
| 0.1405 | 8.0 | 66420 | 0.2440 | 0.9454 | 0.9455 | 0.9468 | 0.9453 |
| 0.09 | 9.0 | 74722 | 0.2480 | 0.9436 | 0.9434 | 0.9453 | 0.9434 |
| 0.2275 | 10.0 | 83025 | 0.2473 | 0.9466 | 0.9462 | 0.9479 | 0.9463 |
| 0.0114 | 11.0 | 91327 | 0.1953 | 0.9552 | 0.9550 | 0.9556 | 0.9550 |
| 0.0778 | 12.0 | 99630 | 0.2246 | 0.9485 | 0.9486 | 0.9496 | 0.9484 |
| 0.1031 | 13.0 | 107932 | 0.2435 | 0.9444 | 0.9443 | 0.9453 | 0.9442 |
| 0.1419 | 14.0 | 116235 | 0.1751 | 0.9581 | 0.9581 | 0.9587 | 0.9580 |
| 0.0993 | 15.0 | 124537 | 0.2099 | 0.9543 | 0.9541 | 0.9541 | 0.9542 |
| 0.0696 | 16.0 | 132840 | 0.2240 | 0.9557 | 0.9556 | 0.9564 | 0.9556 |
| 0.1697 | 17.0 | 141142 | 0.1904 | 0.9580 | 0.9578 | 0.9582 | 0.9578 |
| 0.0429 | 18.0 | 149445 | 0.2102 | 0.9559 | 0.9558 | 0.9571 | 0.9557 |
| 0.0062 | 19.0 | 157747 | 0.1768 | 0.9602 | 0.9601 | 0.9606 | 0.9601 |
| 0.005 | 20.0 | 166050 | 0.1779 | 0.9624 | 0.9622 | 0.9627 | 0.9623 |
| 0.1395 | 21.0 | 174352 | 0.1801 | 0.9610 | 0.9610 | 0.9615 | 0.9609 |
| 0.0966 | 22.0 | 182655 | 0.1854 | 0.9594 | 0.9594 | 0.9603 | 0.9593 |
| 0.0077 | 23.0 | 190957 | 0.2190 | 0.9573 | 0.9573 | 0.9580 | 0.9572 |
| 0.1032 | 24.0 | 199260 | 0.2281 | 0.9570 | 0.9569 | 0.9578 | 0.9568 |
| 0.1106 | 25.0 | 207562 | 0.2017 | 0.9616 | 0.9615 | 0.9623 | 0.9614 |
| 0.0833 | 26.0 | 215865 | 0.2074 | 0.9619 | 0.9618 | 0.9626 | 0.9617 |
| 0.0257 | 27.0 | 224167 | 0.1716 | 0.9649 | 0.9648 | 0.9654 | 0.9648 |
| 0.002 | 28.0 | 232470 | 0.2144 | 0.9635 | 0.9635 | 0.9647 | 0.9633 |
| 0.016 | 29.0 | 240772 | 0.2237 | 0.9593 | 0.9594 | 0.9609 | 0.9592 |
| 0.0575 | 30.0 | 249075 | 0.1847 | 0.9652 | 0.9652 | 0.9662 | 0.9651 |
| 0.0997 | 31.0 | 257377 | 0.1798 | 0.9687 | 0.9686 | 0.9691 | 0.9686 |
| 0.0017 | 32.0 | 265680 | 0.1985 | 0.9628 | 0.9627 | 0.9632 | 0.9627 |
| 0.0538 | 33.0 | 273982 | 0.1605 | 0.9710 | 0.9710 | 0.9715 | 0.9709 |
| 0.0023 | 34.0 | 282285 | 0.1832 | 0.9674 | 0.9674 | 0.9679 | 0.9673 |
| 0.0459 | 35.0 | 290587 | 0.1877 | 0.9657 | 0.9657 | 0.9665 | 0.9656 |
| 0.0193 | 36.0 | 298890 | 0.1633 | 0.9677 | 0.9677 | 0.9684 | 0.9676 |
| 0.0707 | 37.0 | 307192 | 0.1787 | 0.9686 | 0.9685 | 0.9689 | 0.9684 |
| 0.0985 | 38.0 | 315495 | 0.2076 | 0.9630 | 0.9631 | 0.9643 | 0.9628 |
| 0.0788 | 39.0 | 323797 | 0.1794 | 0.9702 | 0.9702 | 0.9707 | 0.9701 |
| 0.0008 | 40.0 | 332100 | 0.1618 | 0.9733 | 0.9733 | 0.9737 | 0.9732 |
| 0.074 | 41.0 | 340402 | 0.1991 | 0.9668 | 0.9667 | 0.9674 | 0.9666 |
| 0.028 | 42.0 | 348705 | 0.1556 | 0.9742 | 0.9742 | 0.9744 | 0.9741 |
| 0.1092 | 43.0 | 357007 | 0.1567 | 0.9740 | 0.9740 | 0.9743 | 0.9739 |
| 0.0008 | 44.0 | 365310 | 0.1697 | 0.9708 | 0.9707 | 0.9712 | 0.9706 |
| 0.1728 | 45.0 | 373612 | 0.1791 | 0.9701 | 0.9700 | 0.9704 | 0.9700 |
| 0.0004 | 46.0 | 381915 | 0.2024 | 0.9672 | 0.9672 | 0.9681 | 0.9671 |
| 0.0044 | 47.0 | 390217 | 0.1708 | 0.9721 | 0.9720 | 0.9724 | 0.9720 |
| 0.089 | 48.0 | 398520 | 0.1975 | 0.9687 | 0.9687 | 0.9693 | 0.9685 |
| 0.0774 | 49.0 | 406822 | 0.1778 | 0.9709 | 0.9709 | 0.9714 | 0.9708 |
| 0.0012 | 50.0 | 415100 | 0.1799 | 0.9706 | 0.9705 | 0.9711 | 0.9704 |
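Metrics of the shape reported above (accuracy alongside closely tracking F1, precision, and recall) are typically produced by a `compute_metrics` callback passed to the `Trainer`. A sketch using scikit-learn is below; the weighted averaging mode is an assumption, since the card does not state how the multi-class scores were aggregated.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support


def compute_metrics(eval_pred):
    """Metrics callback for Trainer. Assumes weighted averaging across
    classes (the averaging mode is not stated in the card)."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)  # predicted class per example
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```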
### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.0
- Datasets 2.17.1
- Tokenizers 0.15.2