ServerRun 14101
Creator: zenogantner
Program: MyMediaLite-biased-matrix-factorization-k-60
Dataset: collaborativefiltering-sample
Task type: CollaborativeFiltering
Created: 5y 355d ago
Status: Done!
Time: 4s
Memory: 423M
Results: 0.064  0.044  2.27  1.78
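
For context: the program wraps MyMediaLite's BiasedMatrixFactorization recommender, which predicts a rating as the global average plus a user bias, an item bias, and the inner product of two latent factor vectors. The k-60 in the program name is the requested number of latent factors; the predict steps in the log below print "Set num_factors to 3". A minimal sketch of that standard prediction rule (the function and variable names are illustrative, not MyMediaLite's API):

def predict_rating(u, i, global_bias, user_bias, item_bias, P, Q):
    # P[u] and Q[i] are the latent factor vectors of user u and item i,
    # each of length num_factors.
    return global_bias + user_bias[u] + item_bias[i] + sum(
        pu * qi for pu, qi in zip(P[u], Q[i]))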

Log file

... (lines omitted) ...
loss 0.0833711247864593 learn_rate 1.37763001617288 
loss 0.0833408048018678 learn_rate 1.44651151698152 
loss 0.0833130449429058 learn_rate 1.5188370928306 
loss 0.0832882207038353 learn_rate 1.59477894747213 
loss 0.0832667379923492 learn_rate 1.67451789484574 
loss 0.0832490349603314 learn_rate 1.75824378958802 
loss 0.0832355839341926 learn_rate 1.84615597906742 
loss 0.0832268934613571 learn_rate 1.93846377802079 
loss 0.0832235104934356 learn_rate 2.03538696692183 
loss 0.0832260227345821 learn_rate 1.01769348346092 
loss 0.0824506303762114 learn_rate 1.06857815763396 
loss 0.0825097477236211 learn_rate 0.534289078816982 
loss 0.0823743675953624 learn_rate 0.561003532757831 
loss 0.0823265312309843 learn_rate 0.589053709395722 
loss 0.0823039048453858 learn_rate 0.618506394865509 
loss 0.0822896937030733 learn_rate 0.649431714608784 
loss 0.0822785225155949 learn_rate 0.681903300339223 
loss 0.0822684901023949 learn_rate 0.715998465356184 
loss 0.0822589540137263 learn_rate 0.751798388623994 
loss 0.0822497490070064 learn_rate 0.789388308055193 
loss 0.0822408824364115 learn_rate 0.828857723457953 
loss 0.0822324211505263 learn_rate 0.870300609630851 
loss 0.0822244547991026 learn_rate 0.913815640112393 
loss 0.0822170863012226 learn_rate 0.959506422118013 
loss 0.0822104304807329 learn_rate 1.00748174322391 
loss 0.0822046147174101 learn_rate 1.05785583038511 
loss 0.0821997800286412 learn_rate 1.11074862190437 
loss 0.0821960822830124 learn_rate 1.16628605299958 
loss 0.0821936935150795 learn_rate 1.22460035564956 
loss 0.0821928033473053 learn_rate 1.28583037343204 
loss 0.0821936205282987 learn_rate 0.642915186716021 
loss 0.0820138632745414 learn_rate 0.675060946051822 
loss 0.0819736041943832 learn_rate 0.708813993354413 
loss 0.0819608311615254 learn_rate 0.744254693022133 
loss 0.0819547800218456 learn_rate 0.78146742767324 
loss 0.0819505815191395 learn_rate 0.820540799056902 
loss 0.081947134899689 learn_rate 0.861567839009747 
loss 0.0819442860831745 learn_rate 0.904646230960235 
loss 0.0819420915017778 learn_rate 0.949878542508247 
loss 0.0819406509950951 learn_rate 0.997372469633659 
loss 0.0819400794024348 learn_rate 1.04724109311534 
loss 0.081940504174772 learn_rate 0.523620546557671 
loss 0.0818479373674107 learn_rate 0.549801573885555 
loss 0.0818163461605373 learn_rate 0.577291652579832 
loss 0.0818033219003585 learn_rate 0.606156235208824 
loss 0.0817968576693379 learn_rate 0.636464046969265 
loss 0.0817929601561672 learn_rate 0.668287249317728 
loss 0.0817901669884906 learn_rate 0.701701611783615 
loss 0.0817879639575646 learn_rate 0.736786692372796 
loss 0.0817862112738261 learn_rate 0.773626026991435 
loss 0.0817849115159238 learn_rate 0.812307328341007 
loss 0.0817841190107091 learn_rate 0.852922694758058 
loss 0.0817839085188956 learn_rate 0.895568829495961 
loss 0.0817843664125912 learn_rate 0.44778441474798 
loss 0.0817290851743747 learn_rate 0.470173635485379 
loss 0.0817056963124367 learn_rate 0.493682317259648 
loss 0.0816945046687649 learn_rate 0.518366433122631 
loss 0.0816884775038135 learn_rate 0.544284754778762 
loss 0.0816848484577515 learn_rate 0.571498992517701 
loss 0.08168241713706 learn_rate 0.600073942143586 
loss 0.0816806441804083 learn_rate 0.630077639250765 
loss 0.0816793028491531 learn_rate 0.661581521213303 
loss 0.0816783200687638 learn_rate 0.694660597273968 
loss 0.0816776971443861 learn_rate 0.729393627137667 
loss 0.081677470735132 learn_rate 0.76586330849455 
loss 0.0816776951631481 learn_rate 0.382931654247275 
loss 0.0816443267270702 learn_rate 0.402078236959639 
loss 0.0816276365534152 learn_rate 0.422182148807621 
loss 0.0816185597449316 learn_rate 0.443291256248002 
loss 0.0816131952896907 learn_rate 0.465455819060402 
loss 0.0816097801700703 learn_rate 0.488728610013422 
loss 0.0816074586026861 learn_rate 0.513165040514093 
loss 0.0816057911506405 learn_rate 0.538823292539798 
loss 0.0816045514857058 learn_rate 0.565764457166788 
loss 0.0816036324008261 learn_rate 0.594052680025127 
loss 0.0816029958629617 learn_rate 0.623755314026384 
loss 0.0816026440547629 learn_rate 0.654943079727703 
loss 0.0816026024735288 learn_rate 0.687690233714088 
loss 0.081602910601281 learn_rate 0.343845116857044 
loss 0.081579637212795 learn_rate 0.361037372699896 
loss 0.0815668924916913 learn_rate 0.379089241334891 
loss 0.081559465658195 learn_rate 0.398043703401636 
loss 0.0815548571230871 learn_rate 0.417945888571717 
loss 0.0815518363506455 learn_rate 0.438843183000303 
loss 0.0815497666703836 learn_rate 0.460785342150318 
loss 0.0815483008561228 learn_rate 0.483824609257834 
loss 0.0815472448435123 learn_rate 0.508015839720726 
loss 0.0815464936925195 learn_rate 0.533416631706762 
loss 0.0815459980230582 learn_rate 0.5600874632921 
loss 0.0815457441300531 learn_rate 0.588091836456705 
loss 0.0815457413403391 learn_rate 0.617496428279541 
loss 0.0815460139740829 learn_rate 0.30874821413977 
loss 0.081529687777125 learn_rate 0.324185624846759 
loss 0.0815200106421438 learn_rate 0.340394906089097 
loss 0.0815139992878001 learn_rate 0.357414651393552 
loss 0.0815100795714764 learn_rate 0.375285383963229 
loss 0.0815074130504459 learn_rate 0.394049653161391 
loss 0.081505539516206 learn_rate 0.41375213581946 
loss 0.0815041956614759 learn_rate 0.434439742610433 
loss 0.0815032259697831 learn_rate 0.456161729740955 
loss 0.0815025389447723 learn_rate 0.478969816228003 
loss 0.0815020844521275 learn_rate 0.502918307039403 
loss 0.0815018406807642 learn_rate 0.528064222391373 
loss 0.0815018057412818 learn_rate 0.554467433510942 
loss 0.0815019918998371 learn_rate 0.277233716755471 
loss 0.0814904830726034 learn_rate 0.291095402593245 
loss 0.0814831692375657 learn_rate 0.305650172722907 
loss 0.0814783515243743 learn_rate 0.320932681359052 
loss 0.0814750550559453 learn_rate 0.336979315427005 
loss 0.0814727216309536 learn_rate 0.353828281198355 
loss 0.0814710271630478 learn_rate 0.371519695258273 
loss 0.0814697782572496 learn_rate 0.390095680021186 
loss 0.0814688560992829 learn_rate 0.409600464022246 
loss 0.0814681869093562 learn_rate 0.430080487223358 
loss 0.0814677263770266 learn_rate 0.451584511584526 
loss 0.0814674510288289 learn_rate 0.474163737163752 
loss 0.0814673529207058 learn_rate 0.49787192402194 
loss 0.081467435991539 learn_rate 0.24893596201097 
loss 0.0814592857319104 learn_rate 0.261382760111519 
loss 0.0814537771073886 learn_rate 0.274451898117094 
loss 0.081449948725492 learn_rate 0.288174493022949 
loss 0.0814472064944458 learn_rate 0.302583217674097 
loss 0.0814451870050603 learn_rate 0.317712378557802 
loss 0.0814436676588406 learn_rate 0.333597997485692 
loss 0.0814425102585622 learn_rate 0.350277897359976 
loss 0.0814416272388765 learn_rate 0.367791792227975 
loss 0.0814409623234015 learn_rate 0.386181381839374 
loss 0.0814404798014888 learn_rate 0.405490450931343 
loss 0.0814401586131073 learn_rate 0.42576497347791 
loss 0.0814399889359971 learn_rate 0.447053222151805 
loss 0.0814399700066762 learn_rate 0.469405883259395 
loss 0.0814401085561645 learn_rate 0.234702941629698 
loss 0.0814334604135646 learn_rate 0.246438088711183 
loss 0.0814288435382719 learn_rate 0.258759993146742 
loss 0.0814255628274441 learn_rate 0.271697992804079 
loss 0.0814231726608527 learn_rate 0.285282892444283 
loss 0.0814213911708468 learn_rate 0.299547037066497 
loss 0.0814200408191022 learn_rate 0.314524388919822 
loss 0.0814190090689969 learn_rate 0.330250608365813 
loss 0.0814182235743452 learn_rate 0.346763138784104 
loss 0.081417637257226 learn_rate 0.364101295723309 
loss 0.0814172197125469 learn_rate 0.382306360509474 
loss 0.0814169523927971 learn_rate 0.401421678534948 
loss 0.0814168258897852 learn_rate 0.421492762461695 
loss 0.0814168382948662 learn_rate 0.210746381230848 
loss 0.0814120932312344 learn_rate 0.22128370029239 
loss 0.0814086239877475 learn_rate 0.23234788530701 
loss 0.0814060409289539 learn_rate 0.24396527957236 
loss 0.0814040784844727 learn_rate 0.256163543550978 
loss 0.0814025588476626 learn_rate 0.268971720728527 
loss 0.0814013644137626 learn_rate 0.282420306764953 
loss 0.0814004177743583 learn_rate 0.296541322103201 
loss 0.0813996678695923 learn_rate 0.311368388208361 
loss 0.0813990808723875 learn_rate 0.326936807618779 
loss 0.0813986344951514 learn_rate 0.343283647999718 
loss 0.081398314616011 learn_rate 0.360447830399704 
loss 0.0813981133674952 learn_rate 0.378470221919689 
loss 0.0813980280739482 learn_rate 0.397393733015674 
loss 0.0813980606363797 learn_rate 0.198696866507837 
loss 0.0813941691638715 learn_rate 0.208631709833229 
loss 0.0813912576150765 learn_rate 0.21906329532489 
loss 0.0813890457966943 learn_rate 0.230016460091135 
loss 0.0813873368269185 learn_rate 0.241517283095691 
loss 0.0813859950614532 learn_rate 0.253593147250476 
loss 0.0813849285005516 learn_rate 0.266272804613 
loss 0.081384075380947 learn_rate 0.27958644484365 
loss 0.0813833944271145 learn_rate 0.293565767085832 
loss 0.0813828581110814 learn_rate 0.308244055440124 
loss 0.0813824482390981 learn_rate 0.32365625821213 
loss 0.0813821532269031 learn_rate 0.339839071122737 
loss 0.0813819665181522 learn_rate 0.356831024678874 
loss 0.0813818857176385 learn_rate 0.374672575912817 
loss 0.0813819121308423 learn_rate 0.187336287956409 
loss 0.0813787093727409 learn_rate 0.196703102354229 
loss 0.0813762604803867 learn_rate 0.206538257471941 
loss 0.0813743639342364 learn_rate 0.216865170345538 
loss 0.0813728740398498 learn_rate 0.227708428862814 
loss 0.081371687675031 learn_rate 0.239093850305955 
loss 0.081370733200612 learn_rate 0.251048542821253 
loss 0.0813699615906176 learn_rate 0.263600969962316 
loss 0.0813693396628376 learn_rate 0.276781018460431 
loss 0.0813688451649829 learn_rate 0.290620069383453 
loss 0.0813684633983681 learn_rate 0.305151072852626 
loss 0.0813681850367981 learn_rate 0.320408626495257 
loss 0.0813680048141024 learn_rate 0.33642905782002 
loss 0.0813679207973288 learn_rate 0.353250510711021 
loss 0.0813679340211795 learn_rate 0.17662525535551 
loss 0.0813652919380551 learn_rate 0.185456518123286 
loss 0.0813632297798204 learn_rate 0.19472934402945 
loss 0.0813616028619882 learn_rate 0.204465811230923 
loss 0.0813603037499016 learn_rate 0.214689101792469 
loss 0.0813592544442967 learn_rate 0.225423556882092 
loss 0.0813583995072775 learn_rate 0.236694734726197 
loss 0.0813577002896738 learn_rate 0.248529471462507 
loss 0.0813571303073726 learn_rate 0.260955945035632 
loss 0.08135667171856 learn_rate 0.274003742287414 
loss 0.0813563127827678 learn_rate 0.287703929401785 
loss 0.0813560461390228 learn_rate 0.302089125871874 
loss 0.0813558677232287 learn_rate 0.317193582165468 
loss 0.0813557761500046 learn_rate 0.333053261273741 
loss 0.0813557724054854 learn_rate 0.349705924337428 
loss 0.0813558597282652 learn_rate 0.174852962168714 
loss 0.081353364338921 learn_rate 0.18359561027715 
loss 0.0813514176307705 learn_rate 0.192775390791007 
loss 0.0813498848180294 learn_rate 0.202414160330558 
loss 0.0813486653928064 learn_rate 0.212534868347085 
loss 0.0813476860979652 learn_rate 0.22316161176444 
loss 0.0813468946907169 learn_rate 0.234319692352662 
loss 0.0813462546609303 learn_rate 0.246035676970295 
loss 0.0813457409655072 learn_rate 0.25833746081881 
loss 0.0813453367515541 learn_rate 0.27125433385975 
loss 0.0813450309732881 learn_rate 0.284817050552738 
loss 0.0813448167639491 learn_rate 0.299057903080374 
loss 0.0813446904042224 learn_rate 0.314010798234393 
loss 0.0813446507296328 learn_rate 0.329711338146113 
loss 0.0813446988359272 learn_rate 0.164855669073056 
loss 0.0813426295447725 learn_rate 0.173098452526709 
loss 0.081340983978064 learn_rate 0.181753375153045 
loss 0.0813396651949839 learn_rate 0.190841043910697 
loss 0.0813385990968427 learn_rate 0.200383096106232 
loss 0.0813377303825521 learn_rate 0.210402250911543 
loss 0.081337018759645 learn_rate 0.220922363457121 
loss 0.0813364355681903 learn_rate 0.231968481629977 
loss 0.0813359609167634 learn_rate 0.243566905711475 
loss 0.0813355813725087 learn_rate 0.255745250997049 
loss 0.081335288196927 learn_rate 0.268532513546902 
loss 0.0813350760793919 learn_rate 0.281959139224247 
loss 0.0813349422938954 learn_rate 0.296057096185459 
loss 0.0813348861913722 learn_rate 0.310859950994732 
loss 0.0813349089388265 learn_rate 0.155429975497366 
loss 0.0813331932755586 learn_rate 0.163201474272234 
loss 0.081331803679872 learn_rate 0.171361547985846 
loss 0.081330670835012 learn_rate 0.179929625385138 
loss 0.0813297405133467 learn_rate 0.188926106654395 
loss 0.0813289713343777 learn_rate 0.198372411987115 
loss 0.0813283325316697 learn_rate 0.208291032586471 
loss 0.0813278018512352 learn_rate 0.218705584215794 
loss 0.0813273636750114 learn_rate 0.229640863426584 
loss 0.0813270074304075 learn_rate 0.241122906597913 
loss 0.0813267263141607 learn_rate 0.253179051927809 
loss 0.0813265163291846 learn_rate 0.265838004524199 
loss 0.081326375609312 learn_rate 0.279129904750409 
loss 0.0813263039903217 learn_rate 0.29308639998793 
loss 0.081326302776892 learn_rate 0.307740719987326 
loss 0.0813263746535167 learn_rate 0.153870359993663 
loss 0.0813247517616704 learn_rate 0.161563877993346 
loss 0.0813234372367969 learn_rate 0.169642071893014 
loss 0.0813223666331721 learn_rate 0.178124175487664 
loss 0.0813214893542341 learn_rate 0.187030384262048 
loss 0.0813207666637046 learn_rate 0.19638190347515 
loss 0.0813201696814672 learn_rate 0.206200998648907 
loss 0.0813196774785552 learn_rate 0.216511048581353 
loss 0.0813192753607364 learn_rate 0.227336601010421 
loss 0.0813189534012754 learn_rate 0.238703431060942 
loss 0.0813187052538081 learn_rate 0.250638602613989 
loss 0.0813185272489142 learn_rate 0.263170532744688 
loss 0.0813184177553899 learn_rate 0.276329059381922 
loss 0.081318376771009 learn_rate 0.290145512351019 
loss 0.081318405698295 learn_rate 0.145072756175509 
loss 0.0813170501288855 learn_rate 0.152326393984285 
loss 0.0813159333664289 learn_rate 0.159942713683499 
loss 0.0813150091373354 learn_rate 0.167939849367674 
loss 0.0813142403181906 learn_rate 0.176336841836058 
loss 0.0813135978926091 learn_rate 0.185153683927861 
loss 0.0813130598135124 learn_rate 0.194411368124254 
loss 0.0813126098497609 learn_rate 0.204131936530466 
loss 0.0813122364864601 learn_rate 0.21433853335699 
loss 0.0813119319341349 learn_rate 0.225055460024839 
loss 0.0813116912851019 learn_rate 0.236308233026081 
loss 0.0813115118376617 learn_rate 0.248123644677385 
loss 0.0813113925920416 learn_rate 0.260529826911254 
loss 0.0813113339079433 learn_rate 0.273556318256817 
loss 0.081311337303116 learn_rate 0.136778159128409 
loss 0.0813102072098303 learn_rate 0.143617067084829 
loss 0.0813092608766979 learn_rate 0.15079792043907 
loss 0.081308465387705 learn_rate 0.158337816461024 
loss 0.0813077937954915 learn_rate 0.166254707284075 
loss 0.0813072246262017 learn_rate 0.174567442648279 
loss 0.081306741273978 learn_rate 0.183295814780693 
loss 0.0813063313350916 learn_rate 0.192460605519728 
loss 0.081305985930039 learn_rate 0.202083635795714 
loss 0.0813056990566432 learn_rate 0.2121878175855 
loss 0.081305467008988 learn_rate 0.222797208464775 
loss 0.08130528788692 learn_rate 0.233937068888013 
loss 0.0813051612099999 learn_rate 0.245633922332414 
loss 0.0813050876393489 learn_rate 0.257915618449035 
loss 0.0813050688018365 learn_rate 0.270811399371487 
loss 0.0813051072042031 learn_rate 0.135405699685743 
loss 0.0813040382013273 learn_rate 0.14217598467003 
loss 0.0813031426042134 learn_rate 0.149284783903532 
loss 0.081302389941404 learn_rate 0.156749023098709 
loss 0.0813017551948321 learn_rate 0.164586474253644 
loss 0.0813012183774022 learn_rate 0.172815797966326 
loss 0.0813007640027641 learn_rate 0.181456587864643 
loss 0.0813003804924427 learn_rate 0.190529417257875 
loss 0.0813000595646739 learn_rate 0.200055888120768 
loss 0.0812997956451148 learn_rate 0.210058682526807 
loss 0.0812995853326183 learn_rate 0.220561616653147 
loss 0.081299426944344 learn_rate 0.231589697485805 
loss 0.0812993201546881 learn_rate 0.243169182360095 
loss 0.0812992657328613 learn_rate 0.2553276414781 
loss 0.0812992653754494 learn_rate 0.268094023552005 
loss 0.0812993216236198 learn_rate 0.134047011776002 
loss 0.0812983022605293 learn_rate 0.140749362364802 
loss 0.0812974481670142 learn_rate 0.147786830483043 
loss 0.0812967308186507 learn_rate 0.155176172007195 
loss 0.0812961267239828 learn_rate 0.162934980607554 
loss 0.0812956170629138 learn_rate 0.171081729637932 
loss 0.0812951872192405 learn_rate 0.179635816119829 
loss 0.0812948262484254 learn_rate 0.18861760692582 
loss 0.0812945263216177 learn_rate 0.198048487272111 
loss 0.0812942821836838 learn_rate 0.207950911635717 
loss 0.0812940906570699 learn_rate 0.218348457217503 
loss 0.0812939502154221 learn_rate 0.229265880078378 
loss 0.0812938606419854 learn_rate 0.240729174082297 
loss 0.0812938227788446 learn_rate 0.252765632786412 
loss 0.0812938383649865 learn_rate 0.126382816393206 
loss 0.0812929791290647 learn_rate 0.132701957212866 
loss 0.0812922484611703 learn_rate 0.139337055073509 
loss 0.0812916259146624 learn_rate 0.146303907827185 
loss 0.081291094329654 learn_rate 0.153619103218544 
loss 0.0812906397091274 learn_rate 0.161300058379471 
loss 0.0812902510034542 learn_rate 0.169365061298445 
loss 0.0812899198269662 learn_rate 0.177833314363367 
loss 0.0812896401326129 learn_rate 0.186724980081535 
loss 0.0812894078710171 learn_rate 0.196061229085612 
loss 0.0812892206584845 learn_rate 0.205864290539893 
loss 0.0812890774749381 learn_rate 0.216157505066888 
loss 0.0812889784078485 learn_rate 0.226965380320232 
loss 0.0812889244525168 learn_rate 0.238313649336244 
loss 0.0812889173732417 learn_rate 0.250229331803056 
loss 0.0812889596245012 learn_rate 0.125114665901528 
loss 0.0812881467164973 learn_rate 0.131370399196604 
loss 0.0812874549360222 learn_rate 0.137938919156434 
loss 0.081286865415942 learn_rate 0.144835865114256 
loss 0.0812863622832953 learn_rate 0.152077658369969 
loss 0.0812859325647129 learn_rate 0.159681541288467 
loss 0.0812855660052756 learn_rate 0.167665618352891 
loss 0.0812852548217284 learn_rate 0.176048899270535 
loss 0.0812849934135041 learn_rate 0.184851344234062 
loss 0.0812847780556441 learn_rate 0.194093911445765 
loss 0.0812846065964556 learn_rate 0.203798607018054 
loss 0.081284478179774 learn_rate 0.213988537368956 
loss 0.0812843930074227 learn_rate 0.224687964237404 
loss 0.0812843521523449 learn_rate 0.235922362449274 
loss 0.0812843574275249 learn_rate 0.117961181224637 
loss 0.0812836689090465 learn_rate 0.123859240285869 
loss 0.0812830748181386 learn_rate 0.130052202300162 
loss 0.0812825616532173 learn_rate 0.136554812415171 
loss 0.0812821178518096 learn_rate 0.143382553035929 
loss 0.0812817338036578 learn_rate 0.150551680687726 
loss 0.0812814017966248 learn_rate 0.158079264722112 
loss 0.0812811159060009 learn_rate 0.165983227958217 
loss 0.0812808718407549 learn_rate 0.174282389356128 
loss 0.0812806667621138 learn_rate 0.182996508823935 
loss 0.0812804990904943 learn_rate 0.192146334265132 
loss 0.0812803683161347 learn_rate 0.201753650978388 
 training_time 00:00:00.1234310 
memory 0
Save model to model.txt
=== END program1: ./run learn ../dataset2/train --- OK [1s]

===== MAIN: predict/evaluate on train data =====
=== START program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in
=== END program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in --- OK [1s]
=== START program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out
loading_time 0.09
ratings range: [0, 4]
training data: 3 users, 3 items, 5 ratings, sparsity 44.44444
test data:     3 users, 3 items, 5 ratings, sparsity 44.44444
Load model from model.txt
Set num_factors to 3
BiasedMatrixFactorization num_factors=60 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 3.14598 MAE 2.96303 NMAE 0.74076 testing_time 00:00:00.0025490
predicting_time 00:00:00.0033930
memory 0
=== END program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out --- OK [1s]
=== START program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out
=== END program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out --- OK [1s]

===== MAIN: predict/evaluate on test data =====
=== START program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in
=== END program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in --- OK [1s]
=== START program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out
loading_time 0.03
ratings range: [0, 4]
training data: 3 users, 3 items, 5 ratings, sparsity 44.44444
test data:     3 users, 3 items, 4 ratings, sparsity 55.55556
Load model from model.txt
Set num_factors to 3
BiasedMatrixFactorization num_factors=60 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 2.03226 MAE 1.21603 NMAE 0.30401 testing_time 00:00:00.0025970
predicting_time 00:00:00.0033450
memory 0
=== END program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out --- OK [1s]
=== START program4: ./run evaluate ../dataset2/test ../program0/evalTest.out
=== END program4: ./run evaluate ../dataset2/test ../program0/evalTest.out --- OK [1s]


real	0m9.156s
user	0m6.684s
sys	0m2.144s
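
The loss / learn_rate trace in the log follows a bold-driver-style schedule: after an epoch that lowers the loss the learning rate grows by about 5%, and after an epoch that raises it the rate is halved (e.g. 2.03538... drops to 1.01769... immediately after the first logged loss increase). A minimal sketch of that adaptation, assuming a hypothetical train_epoch() stand-in for one SGD sweep; MyMediaLite's actual implementation may differ in details, such as also rolling back the worsening update:

def bold_driver(train_epoch, learn_rate, num_epochs):
    # Grow the learning rate 5% after an improving epoch, halve it after a
    # worsening one -- the pattern visible in the learn_rate column above.
    prev_loss = float("inf")
    for _ in range(num_epochs):
        loss = train_epoch(learn_rate)  # one SGD pass, returns training loss
        learn_rate *= 1.05 if loss <= prev_loss else 0.5
        prev_loss = loss
        print("loss", loss, "learn_rate", learn_rate)
    return learn_rate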

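For reference, the RMSE / MAE / NMAE figures printed by the predict steps follow the standard definitions, with NMAE normalizing MAE by the width of the rating range: with the logged range [0, 4], NMAE = MAE / 4 (1.21603 / 4 = 0.30401, matching the test line). Likewise the logged sparsity is 1 - ratings / (users * items), e.g. 1 - 5/9 = 44.44444% for the training data. A minimal sketch of the three metrics (the function name and signature are illustrative):

import math

def rating_errors(predictions, targets, min_rating=0.0, max_rating=4.0):
    # Root mean squared error, mean absolute error, and MAE normalized by
    # the width of the rating scale.
    n = len(targets)
    rmse = math.sqrt(sum((p - t) ** 2 for p, t in zip(predictions, targets)) / n)
    mae = sum(abs(p - t) for p, t in zip(predictions, targets)) / n
    return rmse, mae, mae / (max_rating - min_rating)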