Run 14104
Creator:   zenogantner
Program:   MyMediaLite-biased-matrix-factorization-k-40
Dataset:   collaborativefiltering-sample
Task type: CollaborativeFiltering
Created:   5y 299d ago
Download:  login required
Status:    Done

Time:      7s
Memory:    418M
Task type: CollaborativeFiltering
Metrics:   0.066 / 0.045 / 2.27 / 1.78

Log file

... (lines omitted) ...
loss 0.0833546233276025 learn_rate 0.656014293415656 
loss 0.0833078899486699 learn_rate 0.688815008086439 
loss 0.0832862197658992 learn_rate 0.723255758490761 
loss 0.0832728293833943 learn_rate 0.759418546415299 
loss 0.0832627311857737 learn_rate 0.797389473736064 
loss 0.0832546602836404 learn_rate 0.837258947422868 
loss 0.0832484970716857 learn_rate 0.879121894794011 
loss 0.083244443600928 learn_rate 0.923077989533712 
loss 0.0832427937604996 learn_rate 0.969231889010397 
loss 0.0832438823442588 learn_rate 0.484615944505199 
loss 0.0829996854128667 learn_rate 0.508846741730459 
loss 0.0829060523297148 learn_rate 0.534289078816982 
loss 0.0828640356215837 learn_rate 0.561003532757831 
loss 0.0828422029848646 learn_rate 0.589053709395722 
loss 0.0828291238751656 learn_rate 0.618506394865509 
loss 0.0828201608942615 learn_rate 0.649431714608784 
loss 0.0828134023760136 learn_rate 0.681903300339223 
loss 0.0828081451637945 learn_rate 0.715998465356184 
loss 0.0828042271925922 learn_rate 0.751798388623994 
loss 0.0828017203656978 learn_rate 0.789388308055193 
loss 0.082800796597685 learn_rate 0.828857723457953 
loss 0.0828016757223073 learn_rate 0.414428861728977 
loss 0.0826593596515177 learn_rate 0.435150304815425 
loss 0.0825934474142879 learn_rate 0.456907820056197 
loss 0.0825595654236517 learn_rate 0.479753211059007 
loss 0.0825403236223421 learn_rate 0.503740871611957 
loss 0.0825283761951839 learn_rate 0.528927915192555 
loss 0.0825203318357156 learn_rate 0.555374310952183 
loss 0.0825145370077006 learn_rate 0.583143026499792 
loss 0.0825101993251173 learn_rate 0.612300177824781 
loss 0.0825069856188823 learn_rate 0.642915186716021 
loss 0.08250481149619 learn_rate 0.675060946051822 
loss 0.0825037255871697 learn_rate 0.708813993354413 
loss 0.0825038477249802 learn_rate 0.354406996677206 
loss 0.0824195050463481 learn_rate 0.372127346511067 
loss 0.0823741339681404 learn_rate 0.39073371383662 
loss 0.0823479139516393 learn_rate 0.410270399528451 
loss 0.0823316315127116 learn_rate 0.430783919504874 
loss 0.0823208543348238 learn_rate 0.452323115480117 
loss 0.0823133260411038 learn_rate 0.474939271254123 
loss 0.0823078317904025 learn_rate 0.49868623481683 
loss 0.0823037019385898 learn_rate 0.523620546557671 
loss 0.082300581033093 learn_rate 0.549801573885555 
loss 0.082298306809722 learn_rate 0.577291652579832 
loss 0.0822968387236565 learn_rate 0.606156235208824 
loss 0.082296213070418 learn_rate 0.636464046969265 
loss 0.0822965150294168 learn_rate 0.318232023484633 
loss 0.0822385399044525 learn_rate 0.334143624658864 
loss 0.0822046910385201 learn_rate 0.350850805891807 
loss 0.0821838354905398 learn_rate 0.368393346186398 
loss 0.0821702556405521 learn_rate 0.386813013495718 
loss 0.0821609692880537 learn_rate 0.406153664170504 
loss 0.0821543643200523 learn_rate 0.426461347379029 
loss 0.0821495287372486 learn_rate 0.44778441474798 
loss 0.0821459295382172 learn_rate 0.470173635485379 
loss 0.0821432572451481 learn_rate 0.493682317259648 
loss 0.0821413456105001 learn_rate 0.518366433122631 
loss 0.082140125209916 learn_rate 0.544284754778762 
loss 0.0821395935430028 learn_rate 0.571498992517701 
loss 0.0821397947228613 learn_rate 0.28574949625885 
loss 0.0820995993225636 learn_rate 0.300036971071793 
loss 0.0820743816182291 learn_rate 0.315038819625382 
loss 0.0820578984435864 learn_rate 0.330790760606652 
loss 0.082046649938772 learn_rate 0.347330298636984 
loss 0.0820386698171586 learn_rate 0.364696813568833 
loss 0.0820328328137215 learn_rate 0.382931654247275 
loss 0.0820284744332371 learn_rate 0.402078236959639 
loss 0.082025189880765 learn_rate 0.422182148807621 
loss 0.0820227301198403 learn_rate 0.443291256248002 
loss 0.0820209475567871 learn_rate 0.465455819060402 
loss 0.0820197658012972 learn_rate 0.488728610013422 
loss 0.0820191608915699 learn_rate 0.513165040514093 
loss 0.0820191483452666 learn_rate 0.538823292539798 
loss 0.0820197737575339 learn_rate 0.269411646269899 
loss 0.081987596038868 learn_rate 0.282882228583394 
loss 0.0819667707626275 learn_rate 0.297026340012564 
loss 0.0819528353104332 learn_rate 0.311877657013192 
loss 0.0819431770104185 learn_rate 0.327471539863851 
loss 0.0819362713542851 learn_rate 0.343845116857044 
loss 0.081931219414293 learn_rate 0.361037372699896 
loss 0.0819274786503025 learn_rate 0.379089241334891 
loss 0.0819247126335507 learn_rate 0.398043703401636 
loss 0.0819227099177813 learn_rate 0.417945888571717 
loss 0.0819213407472046 learn_rate 0.438843183000303 
loss 0.0819205332323291 learn_rate 0.460785342150318 
loss 0.0819202590633791 learn_rate 0.483824609257834 
loss 0.0819205238951981 learn_rate 0.241912304628917 
loss 0.081897957668718 learn_rate 0.254007919860363 
loss 0.0818824567121621 learn_rate 0.266708315853381 
loss 0.081871529770254 learn_rate 0.28004373164605 
loss 0.0818636093790355 learn_rate 0.294045918228353 
loss 0.0818577199448781 learn_rate 0.30874821413977 
loss 0.0818532549834944 learn_rate 0.324185624846759 
loss 0.0818498338301342 learn_rate 0.340394906089097 
loss 0.0818472134845071 learn_rate 0.357414651393552 
loss 0.0818452367188365 learn_rate 0.375285383963229 
loss 0.0818438026666456 learn_rate 0.394049653161391 
loss 0.0818428504651661 learn_rate 0.41375213581946 
loss 0.0818423499608752 learn_rate 0.434439742610433 
loss 0.0818422959865627 learn_rate 0.456161729740955 
loss 0.0818427043840385 learn_rate 0.228080864870478 
loss 0.081824461025225 learn_rate 0.239484908114001 
loss 0.0818115965991242 learn_rate 0.251459153519702 
loss 0.0818023298577788 learn_rate 0.264032111195687 
loss 0.0817954992609022 learn_rate 0.277233716755471 
loss 0.0817903578662732 learn_rate 0.291095402593245 
loss 0.0817864283927058 learn_rate 0.305650172722907 
loss 0.0817834050197612 learn_rate 0.320932681359052 
loss 0.0817810897742397 learn_rate 0.336979315427005 
loss 0.0817793532033367 learn_rate 0.353828281198355 
loss 0.0817781111465914 learn_rate 0.371519695258273 
loss 0.0817773115161053 learn_rate 0.390095680021186 
loss 0.0817769268665774 learn_rate 0.409600464022246 
loss 0.0817769500594984 learn_rate 0.204800232011123 
loss 0.081764017630056 learn_rate 0.215040243611679 
loss 0.0817544335912492 learn_rate 0.225792255792263 
loss 0.0817472103099101 learn_rate 0.237081868581876 
loss 0.0817416642556312 learn_rate 0.24893596201097 
loss 0.0817373308686005 learn_rate 0.261382760111519 
loss 0.0817338987222354 learn_rate 0.274451898117094 
loss 0.0817311606746121 learn_rate 0.288174493022949 
loss 0.0817289792036102 learn_rate 0.302583217674097 
loss 0.0817272629638183 learn_rate 0.317712378557802 
loss 0.081725951736176 learn_rate 0.333597997485692 
loss 0.0817250072843588 learn_rate 0.350277897359976 
loss 0.0817244080951063 learn_rate 0.367791792227975 
loss 0.0817241464788726 learn_rate 0.386181381839374 
loss 0.0817242269755125 learn_rate 0.193090690919687 
loss 0.0817136937947484 learn_rate 0.202745225465671 
loss 0.0817057120264773 learn_rate 0.212882486738955 
loss 0.0816995780216925 learn_rate 0.223526611075903 
loss 0.081694790166274 learn_rate 0.234702941629698 
loss 0.0816909979229919 learn_rate 0.246438088711183 
loss 0.0816879604621244 learn_rate 0.258759993146742 
loss 0.0816855144457408 learn_rate 0.271697992804079 
loss 0.0816835500292442 learn_rate 0.285282892444283 
loss 0.0816819938241184 learn_rate 0.299547037066497 
loss 0.081680797427318 learn_rate 0.314524388919822 
loss 0.0816799301425403 learn_rate 0.330250608365813 
loss 0.0816793746582359 learn_rate 0.346763138784104 
loss 0.0816791246611571 learn_rate 0.364101295723309 
loss 0.0816791836081486 learn_rate 0.182050647861654 
loss 0.0816705647880842 learn_rate 0.191153180254737 
loss 0.0816638956093438 learn_rate 0.200710839267474 
loss 0.0816586740758881 learn_rate 0.210746381230848 
loss 0.0816545322844928 learn_rate 0.22128370029239 
loss 0.0816512063168907 learn_rate 0.23234788530701 
loss 0.0816485105657574 learn_rate 0.24396527957236 
loss 0.0816463167523588 learn_rate 0.256163543550978 
loss 0.0816445375259026 learn_rate 0.268971720728527 
loss 0.0816431142447712 learn_rate 0.282420306764953 
loss 0.0816420083443292 learn_rate 0.296541322103201 
loss 0.081641195597772 learn_rate 0.311368388208361 
loss 0.0816406625659289 learn_rate 0.326936807618779 
loss 0.0816404045904117 learn_rate 0.343283647999718 
loss 0.0816404247885258 learn_rate 0.171641823999859 
loss 0.0816333513736321 learn_rate 0.180223915199852 
loss 0.0816277686935206 learn_rate 0.189235110959845 
loss 0.0816233190237071 learn_rate 0.198696866507837 
loss 0.0816197333111921 learn_rate 0.208631709833229 
loss 0.0816168137333832 learn_rate 0.21906329532489 
loss 0.0816144180436992 learn_rate 0.230016460091135 
loss 0.0816124461327486 learn_rate 0.241517283095691 
loss 0.081610829004228 learn_rate 0.253593147250476 
loss 0.0816095201565984 learn_rate 0.266272804613 
loss 0.0816084891926843 learn_rate 0.27958644484365 
loss 0.0816077173604629 learn_rate 0.293565767085832 
loss 0.0816071946627599 learn_rate 0.308244055440124 
loss 0.081606918157745 learn_rate 0.32365625821213 
loss 0.0816068910968582 learn_rate 0.339839071122737 
loss 0.0816071225997455 learn_rate 0.169919535561368 
loss 0.0816004823870125 learn_rate 0.178415512339437 
loss 0.0815952447026722 learn_rate 0.187336287956409 
loss 0.0815910783843995 learn_rate 0.196703102354229 
loss 0.0815877333376605 learn_rate 0.206538257471941 
loss 0.0815850249843411 learn_rate 0.216865170345538 
loss 0.0815828201747185 learn_rate 0.227708428862814 
loss 0.0815810249878371 learn_rate 0.239093850305955 
loss 0.0815795746397702 learn_rate 0.251048542821253 
loss 0.0815784255289325 learn_rate 0.263600969962316 
loss 0.0815775492892223 learn_rate 0.276781018460431 
loss 0.0815769286063249 learn_rate 0.290620069383453 
loss 0.0815765544843953 learn_rate 0.305151072852626 
loss 0.0815764246275834 learn_rate 0.320408626495257 
loss 0.0815765426162445 learn_rate 0.160204313247628 
loss 0.0815710574500425 learn_rate 0.16821452891001 
loss 0.0815666506172211 learn_rate 0.17662525535551 
loss 0.0815630853511822 learn_rate 0.185456518123286 
loss 0.0815601784997282 learn_rate 0.19472934402945 
loss 0.081557791757272 learn_rate 0.204465811230923 
loss 0.0815558232577148 learn_rate 0.214689101792469 
loss 0.0815541999035663 learn_rate 0.225423556882092 
loss 0.0815528706960638 learn_rate 0.236694734726197 
loss 0.0815518012144092 learn_rate 0.248529471462507 
loss 0.0815509692816458 learn_rate 0.260955945035632 
loss 0.0815503617603094 learn_rate 0.274003742287414 
loss 0.0815499723499141 learn_rate 0.287703929401785 
loss 0.0815498002140105 learn_rate 0.302089125871874 
loss 0.0815498492467762 learn_rate 0.151044562935937 
loss 0.0815453192467574 learn_rate 0.158596791082734 
loss 0.081541615232109 learn_rate 0.16652663063687 
loss 0.0815385689816889 learn_rate 0.174852962168714 
loss 0.0815360474058588 learn_rate 0.18359561027715 
loss 0.0815339478204916 learn_rate 0.192775390791007 
loss 0.0815321931031894 learn_rate 0.202414160330558 
loss 0.081530727010905 learn_rate 0.212534868347085 
loss 0.0815295098875588 learn_rate 0.22316161176444 
loss 0.0815285149275501 learn_rate 0.234319692352662 
loss 0.081527725093295 learn_rate 0.246035676970295 
loss 0.0815271307191625 learn_rate 0.25833746081881 
loss 0.0815267277767109 learn_rate 0.27125433385975 
loss 0.0815265167315786 learn_rate 0.284817050552738 
loss 0.0815265018933422 learn_rate 0.299057903080374 
loss 0.0815266911465036 learn_rate 0.149528951540187 
loss 0.0815224289202617 learn_rate 0.157005399117197 
loss 0.0815189441309336 learn_rate 0.164855669073056 
loss 0.0815160812602873 learn_rate 0.173098452526709 
loss 0.0815137168492837 learn_rate 0.181753375153045 
loss 0.0815117553494614 learn_rate 0.190841043910697 
loss 0.0815101248148455 learn_rate 0.200383096106232 
loss 0.0815087726922156 learn_rate 0.210402250911543 
loss 0.0815076619251337 learn_rate 0.220922363457121 
loss 0.0815067675325857 learn_rate 0.231968481629977 
loss 0.0815060737626276 learn_rate 0.243566905711475 
loss 0.0815055718614313 learn_rate 0.255745250997049 
loss 0.0815052584445362 learn_rate 0.268532513546902 
loss 0.0815051344143854 learn_rate 0.281959139224247 
loss 0.081505204339172 learn_rate 0.140979569612123 
loss 0.0815016534978986 learn_rate 0.14802854809273 
loss 0.0814987029872228 learn_rate 0.155429975497366 
loss 0.0814962416738094 learn_rate 0.163201474272234 
loss 0.0814941794638918 learn_rate 0.171361547985846 
loss 0.0814924452202391 learn_rate 0.179929625385138 
loss 0.0814909843904306 learn_rate 0.188926106654395 
loss 0.0814897565182174 learn_rate 0.198372411987115 
loss 0.0814887327969866 learn_rate 0.208291032586471 
loss 0.081487893800531 learn_rate 0.218705584215794 
loss 0.0814872274943463 learn_rate 0.229640863426584 
loss 0.0814867275942404 learn_rate 0.241122906597913 
loss 0.0814863923020519 learn_rate 0.253179051927809 
loss 0.0814862234144708 learn_rate 0.265838004524199 
loss 0.0814862257733402 learn_rate 0.1329190022621 
loss 0.0814832744741513 learn_rate 0.139564952375205 
loss 0.0814807835189207 learn_rate 0.146543199993965 
loss 0.0814786743107163 learn_rate 0.153870359993663 
loss 0.0814768818702986 learn_rate 0.161563877993346 
loss 0.0814753539260123 learn_rate 0.169642071893014 
loss 0.0814740497120337 learn_rate 0.178124175487664 
loss 0.0814729385798406 learn_rate 0.187030384262048 
loss 0.0814719985294577 learn_rate 0.19638190347515 
loss 0.0814712147605949 learn_rate 0.206200998648907 
loss 0.0814705783301608 learn_rate 0.216511048581353 
loss 0.0814700849834823 learn_rate 0.227336601010421 
loss 0.0814697342041125 learn_rate 0.238703431060942 
loss 0.0814695285037896 learn_rate 0.250638602613989 
loss 0.0814694729523501 learn_rate 0.263170532744688 
loss 0.081469574929195 learn_rate 0.131585266372344 
loss 0.0814667967435999 learn_rate 0.138164529690961 
loss 0.0814644511410655 learn_rate 0.145072756175509 
loss 0.0814624657553901 learn_rate 0.152326393984285 
loss 0.0814607805970582 learn_rate 0.159942713683499 
loss 0.0814593472974535 learn_rate 0.167939849367674 
loss 0.0814581280730047 learn_rate 0.176336841836058 
loss 0.0814570945029895 learn_rate 0.185153683927861 
loss 0.0814562262178035 learn_rate 0.194411368124254 
loss 0.0814555095901113 learn_rate 0.204131936530466 
loss 0.0814549365101482 learn_rate 0.21433853335699 
loss 0.0814545033099401 learn_rate 0.225055460024839 
loss 0.0814542098812224 learn_rate 0.236308233026081 
loss 0.0814540590106708 learn_rate 0.248123644677385 
loss 0.0814540559357895 learn_rate 0.260529826911254 
loss 0.0814542081074981 learn_rate 0.130264913455627 
loss 0.0814515689428844 learn_rate 0.136778159128409 
loss 0.0814493409470127 learn_rate 0.143617067084829 
loss 0.0814474566281377 learn_rate 0.15079792043907 
loss 0.0814458598729019 learn_rate 0.158337816461024 
loss 0.0814445053183974 learn_rate 0.166254707284075 
loss 0.0814433574531973 learn_rate 0.174567442648279 
loss 0.0814423895318321 learn_rate 0.183295814780693 
loss 0.0814415823913974 learn_rate 0.192460605519728 
loss 0.0814409232563351 learn_rate 0.202083635795714 
loss 0.0814404046083729 learn_rate 0.2121878175855 
loss 0.0814400231843147 learn_rate 0.222797208464775 
loss 0.0814397791465392 learn_rate 0.233937068888013 
loss 0.0814396754516247 learn_rate 0.245633922332414 
loss 0.0814397174236044 learn_rate 0.122816961166207 
loss 0.0814374963565816 learn_rate 0.128957809224517 
loss 0.0814355947163053 learn_rate 0.135405699685743 
loss 0.0814339643178293 learn_rate 0.14217598467003 
loss 0.0814325643581172 learn_rate 0.149284783903532 
loss 0.081431361273727 learn_rate 0.156749023098709 
loss 0.081430328375559 learn_rate 0.164586474253644 
loss 0.0814294453068976 learn_rate 0.172815797966326 
loss 0.0814286973787974 learn_rate 0.181456587864643 
loss 0.0814280748404105 learn_rate 0.190529417257875 
loss 0.0814275721408961 learn_rate 0.200055888120768 
loss 0.081427187234317 learn_rate 0.210058682526807 
loss 0.0814269209700682 learn_rate 0.220561616653147 
loss 0.0814267765999408 learn_rate 0.231589697485805 
loss 0.0814267594201797 learn_rate 0.243169182360095 
loss 0.081426876554199 learn_rate 0.121584591180047 
loss 0.0814247847645333 learn_rate 0.12766382073905 
loss 0.0814229927975175 learn_rate 0.134047011776002 
loss 0.081421456422636 learn_rate 0.140749362364802 
loss 0.0814201381021546 learn_rate 0.147786830483043 
loss 0.0814190069055735 learn_rate 0.155176172007195 
loss 0.0814180382154816 learn_rate 0.162934980607554 
loss 0.081417213265126 learn_rate 0.171081729637932 
loss 0.0814165185558437 learn_rate 0.179635816119829 
loss 0.0814159452065508 learn_rate 0.18861760692582 
loss 0.0814154882874222 learn_rate 0.198048487272111 
loss 0.0814151461858681 learn_rate 0.207950911635717 
loss 0.0814149200454066 learn_rate 0.218348457217503 
loss 0.0814148133079746 learn_rate 0.229265880078378 
loss 0.0814148313786922 learn_rate 0.114632940039189 
loss 0.0814130619754984 learn_rate 0.120364587041148 
loss 0.0814115261142797 learn_rate 0.126382816393206 
loss 0.0814101922725255 learn_rate 0.132701957212866 
loss 0.0814090332314486 learn_rate 0.139337055073509 
loss 0.0814080261972658 learn_rate 0.146303907827185 
loss 0.0814071527656509 learn_rate 0.153619103218544 
loss 0.0814063987479435 learn_rate 0.161300058379471 
loss 0.0814057538853604 learn_rate 0.169365061298445 
loss 0.0814052114831095 learn_rate 0.177833314363367 
loss 0.0814047679994719 learn_rate 0.186724980081535 
loss 0.0814044226253351 learn_rate 0.196061229085612 
loss 0.0814041768873529 learn_rate 0.205864290539893 
loss 0.0814040343031505 learn_rate 0.216157505066888 
loss 0.0814040001103649 learn_rate 0.226965380320232 
loss 0.0814040810835311 learn_rate 0.113482690160116 
loss 0.0814024154071865 learn_rate 0.119156824668122 
loss 0.0814009685320765 learn_rate 0.125114665901528 
loss 0.081399711637957 learn_rate 0.131370399196604 
loss 0.0813986197957453 learn_rate 0.137938919156434 
loss 0.0813976721061182 learn_rate 0.144835865114256 
loss 0.0813968516934707 learn_rate 0.152077658369969 
loss 0.0813961455707456 learn_rate 0.159681541288467 
loss 0.0813955443979556 learn_rate 0.167665618352891 
loss 0.0813950421627774 learn_rate 0.176048899270535 
loss 0.081394635814975 learn_rate 0.184851344234062 
loss 0.0813943248872969 learn_rate 0.194093911445765 
loss 0.081394111133842 learn_rate 0.203798607018054 
loss 0.0813939982129544 learn_rate 0.213988537368956 
loss 0.0813939914359002 learn_rate 0.224687964237404 
loss 0.0813940975955922 learn_rate 0.112343982118702 
loss 0.081392509434249 learn_rate 0.117961181224637 
loss 0.0813911296719856 learn_rate 0.123859240285869 
loss 0.081389931452388 learn_rate 0.130052202300162 
loss 0.0813888914894837 learn_rate 0.136554812415171 
loss 0.0813879902226487 learn_rate 0.143382553035929 
loss 0.0813872118361205 learn_rate 0.150551680687726 
loss 0.0813865441561083 learn_rate 0.158079264722112 
loss 0.0813859784455027 learn_rate 0.165983227958217 
loss 0.0813855091217262 learn_rate 0.174282389356128 
loss 0.0813851334268315 learn_rate 0.182996508823935 
loss 0.0813848510802391 learn_rate 0.192146334265132 
loss 0.0813846639434515 learn_rate 0.201753650978388 
 training_time 00:00:00.0489350 
memory 0
Save model to model.txt
=== END program1: ./run learn ../dataset2/train --- OK [1s]
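
The loss/learn_rate trace above follows a bold-driver-style schedule: the learning rate grows by a factor of 1.05 after every epoch in which the loss drops (0.656014... x 1.05 = 0.688815...) and is halved as soon as the loss ticks up (0.969231... -> 0.484615...). The model line printed at predict time further down reports bold_driver=False and learn_rate=0.01, so the adaptive behavior apparently comes from the learn-time configuration rather than those defaults. A minimal Python sketch of the heuristic, with a caller-supplied run_epoch standing in for one SGD pass of the trainer:

# Sketch of the bold-driver learning-rate heuristic visible in the log.
# run_epoch(learn_rate) is assumed to perform one SGD pass over the
# ratings and return the epoch loss; only the rate updates are meant to
# mirror the trace above.
def train(run_epoch, learn_rate=0.01, num_iter=30):
    prev_loss = float("inf")
    for _ in range(num_iter):
        loss = run_epoch(learn_rate)
        if loss < prev_loss:
            learn_rate *= 1.05  # improving epoch: grow the rate by 5%
        else:
            learn_rate *= 0.5   # loss increased: halve the rate
        prev_loss = loss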

===== MAIN: predict/evaluate on train data =====
=== START program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in
=== END program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in --- OK [1s]
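
The stripLabels step presumably produces the unlabeled input for the predict step by dropping the rating column, so the predictor only ever sees (user, item) pairs. A hypothetical sketch, assuming whitespace-separated "user item rating" lines; the actual wrapper format is not shown in this log:

# Hypothetical label-stripping step: keep only the (user, item) columns.
def strip_labels(src_path: str, dst_path: str) -> None:
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            user, item = line.split()[:2]
            dst.write(f"{user} {item}\n")
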
=== START program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out
loading_time 0.03
ratings range: [0, 4]
training data: 3 users, 3 items, 5 ratings, sparsity 44.44444
test data:     3 users, 3 items, 5 ratings, sparsity 44.44444
Load model from model.txt
Set num_factors to 3
BiasedMatrixFactorization num_factors=40 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 3.14475 MAE 2.96206 NMAE 0.74051 testing_time 00:00:00.0025370
predicting_time 00:00:00.0031230
memory 0
=== END program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out --- OK [1s]
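
The three error measures in the model line are standard: RMSE and MAE over the predicted ratings, and NMAE, which normalizes MAE by the width of the rating scale. With the [0, 4] range reported above, 2.96206 / 4 = 0.74051, matching the printed NMAE. A small sketch:

# RMSE, MAE, and NMAE as printed by the predict step. NMAE divides MAE by
# the width of the rating scale; with the [0, 4] range above this
# reproduces the reported values (2.96206 / 4 = 0.74051).
import math

def rmse(truth, pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(truth, pred)) / len(truth))

def mae(truth, pred):
    return sum(abs(t - p) for t, p in zip(truth, pred)) / len(truth)

def nmae(truth, pred, r_min=0.0, r_max=4.0):
    return mae(truth, pred) / (r_max - r_min)
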
=== START program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out
=== END program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out --- OK [1s]

===== MAIN: predict/evaluate on test data =====
=== START program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in
=== END program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in --- OK [1s]
=== START program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out
loading_time 0.12
ratings range: [0, 4]
training data: 3 users, 3 items, 5 ratings, sparsity 44.44444
test data:     3 users, 3 items, 4 ratings, sparsity 55.55556
Load model from model.txt
Set num_factors to 3
BiasedMatrixFactorization num_factors=40 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 2.0334 MAE 1.22007 NMAE 0.30502 testing_time 00:00:00.0025150
predicting_time 00:00:00.0936490
memory 0
=== END program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out --- OK [1s]
=== START program4: ./run evaluate ../dataset2/test ../program0/evalTest.out
=== END program4: ./run evaluate ../dataset2/test ../program0/evalTest.out --- OK [1s]
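
For reference, biased matrix factorization scores a (user, item) pair as the global mean plus user and item bias terms plus the dot product of two k-dimensional factor vectors (k=40 in this program's configuration, though the log shows num_factors being set to 3 when the saved model is loaded on this tiny dataset). A generic sketch of the prediction rule, with illustrative names rather than MyMediaLite's actual API, clamped to the [0, 4] rating range from the log:

# Generic biased-MF prediction rule (illustrative names): global mean +
# user bias + item bias + factor dot product, clamped to the rating range.
import numpy as np

def predict(mu, b_user, b_item, P, Q, u, i, r_min=0.0, r_max=4.0):
    # P: n_users x k user factors, Q: n_items x k item factors
    score = mu + b_user[u] + b_item[i] + float(P[u] @ Q[i])
    return min(max(score, r_min), r_max)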


real	0m8.989s
user	0m6.732s
sys	0m1.988s
