ServerRun 14103
Creator: zenogantner
Program: MyMediaLite-biased-matrix-factorization-k-5
Dataset: collaborativefiltering-sample
Task type: CollaborativeFiltering
Created: 5y 299d ago
Status: Done
Time: 10s
Memory: 425M
Results: 0.061, 0.042, 2.27, 1.79
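
For orientation before the log: the program wraps MyMediaLite's BiasedMatrixFactorization recommender, which predicts a rating as the global average plus user and item biases plus a dot product of latent factor vectors, trained by stochastic gradient descent. Below is a minimal Python sketch with hyperparameter names taken from the config line printed in the log (num_factors, reg_u, reg_i, bias_reg, learn_rate, num_iter, init_mean, init_stdev); it illustrates the model, not MyMediaLite's actual C# implementation.

import numpy as np

# Sketch of biased matrix factorization trained by SGD. Parameter
# names mirror the config line printed in the log below; the real
# MyMediaLite (C#) implementation differs in detail.
def train_biased_mf(ratings, num_users, num_items, num_factors=5,
                    learn_rate=0.01, num_iter=30, reg_u=0.015,
                    reg_i=0.015, bias_reg=0.0001,
                    init_mean=0.0, init_stdev=0.1, seed=0):
    rng = np.random.default_rng(seed)
    P = rng.normal(init_mean, init_stdev, (num_users, num_factors))
    Q = rng.normal(init_mean, init_stdev, (num_items, num_factors))
    b_u, b_i = np.zeros(num_users), np.zeros(num_items)
    mu = sum(r for _, _, r in ratings) / len(ratings)  # global average
    for _ in range(num_iter):
        for u, i, r in ratings:  # ratings: list of (user, item, value)
            e = r - (mu + b_u[u] + b_i[i] + P[u] @ Q[i])  # prediction error
            b_u[u] += learn_rate * (e - bias_reg * b_u[u])
            b_i[i] += learn_rate * (e - bias_reg * b_i[i])
            # simultaneous update of both factor vectors
            P[u], Q[i] = (P[u] + learn_rate * (e * Q[i] - reg_u * P[u]),
                          Q[i] + learn_rate * (e * P[u] - reg_i * Q[i]))
    return mu, b_u, b_i, P, Q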

Log file

... (lines omitted) ...
loss 0.0813015316057112 learn_rate 1.37763001617288 
loss 0.0812983380023545 learn_rate 1.44651151698152 
loss 0.0812949764761751 learn_rate 1.5188370928306 
loss 0.081291587531012 learn_rate 1.59477894747213 
loss 0.0812880758700703 learn_rate 1.67451789484574 
loss 0.0812845255966698 learn_rate 1.75824378958802 
loss 0.0812808731725165 learn_rate 1.84615597906742 
loss 0.0812771962804986 learn_rate 1.93846377802079 
loss 0.0812734292026891 learn_rate 2.03538696692183 
loss 0.0812696750206261 learn_rate 2.13715631526793 
loss 0.0812658289139374 learn_rate 2.24401413103132 
loss 0.0812620795878636 learn_rate 2.35621483758289 
loss 0.0812581832095221 learn_rate 2.47402557946203 
loss 0.0812546205723328 learn_rate 2.59772685843514 
loss 0.0812506384031176 learn_rate 2.72761320135689 
loss 0.0812480528339135 learn_rate 2.86399386142474 
loss 0.0812444837483122 learn_rate 3.00719355449597 
loss 0.0812536606834841 learn_rate 1.50359677724799 
loss 0.0812413789521233 learn_rate 1.57877661611039 
loss 0.0812411231907452 learn_rate 1.65771544691591 
loss 0.0812361245624425 learn_rate 1.7406012192617 
loss 0.0812373951972254 learn_rate 0.870300609630851 
loss 0.0812320675462724 learn_rate 0.913815640112393 
loss 0.0812315321938693 learn_rate 0.959506422118013 
loss 0.0812309948415462 learn_rate 1.00748174322391 
loss 0.0812304359109229 learn_rate 1.05785583038511 
loss 0.0812298556171894 learn_rate 1.11074862190437 
loss 0.0812292538147752 learn_rate 1.16628605299958 
loss 0.0812286305679105 learn_rate 1.22460035564956 
loss 0.0812279861260335 learn_rate 1.28583037343204 
loss 0.0812273209791151 learn_rate 1.35012189210364 
loss 0.081226635906387 learn_rate 1.41762798670883 
loss 0.0812259320387556 learn_rate 1.48850938604427 
loss 0.0812252109307813 learn_rate 1.56293485534648 
loss 0.0812244746463162 learn_rate 1.6410815981138 
loss 0.0812237258588253 learn_rate 1.72313567801949 
loss 0.0812229679694677 learn_rate 1.80929246192047 
loss 0.081222205245288 learn_rate 1.89975708501649 
loss 0.0812214429810488 learn_rate 1.99474493926732 
loss 0.0812206876880282 learn_rate 2.09448218623068 
loss 0.0812199473142241 learn_rate 2.19920629554222 
loss 0.0812192315002965 learn_rate 2.30916661031933 
loss 0.0812185518770001 learn_rate 2.4246249408353 
loss 0.0812179224094599 learn_rate 2.54585618787706 
loss 0.0812173597959929 learn_rate 2.67314899727091 
loss 0.0812168839274209 learn_rate 2.80680644713446 
loss 0.0812165184182664 learn_rate 2.94714676949118 
loss 0.0812162912134776 learn_rate 3.09450410796574 
loss 0.0812162352935228 learn_rate 3.24922931336403 
loss 0.0812163894609914 learn_rate 1.62461465668201 
loss 0.0812129930842522 learn_rate 1.70584538951612 
loss 0.0812149211700613 learn_rate 0.852922694758058 
loss 0.0812126751269893 learn_rate 0.895568829495961 
loss 0.0812124215736703 learn_rate 0.940347270970759 
loss 0.0812122127840829 learn_rate 0.987364634519297 
loss 0.0812120013206163 learn_rate 1.03673286624526 
loss 0.0812117873190208 learn_rate 1.08856950955752 
loss 0.0812115714930493 learn_rate 1.1429979850354 
loss 0.0812113546861458 learn_rate 1.20014788428717 
loss 0.0812111379113631 learn_rate 1.26015527850153 
loss 0.0812109223810538 learn_rate 1.32316304242661 
loss 0.0812107095432844 learn_rate 1.38932119454794 
loss 0.0812105011252277 learn_rate 1.45878725427533 
loss 0.0812102991847867 learn_rate 1.5317266169891 
loss 0.0812101061721151 learn_rate 1.60831294783856 
loss 0.0812099250027951 learn_rate 1.68872859523048 
loss 0.0812097591448827 learn_rate 1.77316502499201 
loss 0.0812096127222995 learn_rate 1.86182327624161 
loss 0.081209490637549 learn_rate 1.95491444005369 
loss 0.0812093987171481 learn_rate 2.05266016205637 
loss 0.081209343883781 learn_rate 2.15529317015919 
loss 0.0812093343597429 learn_rate 2.26305782866715 
loss 0.0812093799070189 learn_rate 1.13152891433358 
loss 0.0812079948332604 learn_rate 1.18810536005025 
loss 0.0812079939169518 learn_rate 1.24751062805277 
loss 0.0812077212869056 learn_rate 1.30988615945541 
loss 0.0812075630075898 learn_rate 1.37538046742818 
loss 0.0812073657870824 learn_rate 1.44414949079958 
loss 0.0812072022887711 learn_rate 1.51635696533956 
loss 0.0812070368555736 learn_rate 1.59217481360654 
loss 0.0812068965558225 learn_rate 1.67178355428687 
loss 0.081206767772843 learn_rate 1.75537273200121 
loss 0.0812066686884637 learn_rate 1.84314136860127 
loss 0.0812065915743948 learn_rate 1.93529843703134 
loss 0.081206555089388 learn_rate 2.0320633588829 
loss 0.0812065517470489 learn_rate 2.13366652682705 
loss 0.081206607269039 learn_rate 1.06683326341352 
loss 0.0812054130031077 learn_rate 1.1201749265842 
loss 0.081205330161858 learn_rate 1.17618367291341 
loss 0.081205097713355 learn_rate 1.23499285655908 
loss 0.0812049164378232 learn_rate 1.29674249938704 
loss 0.0812047264088412 learn_rate 1.36157962435639 
loss 0.0812045521482332 learn_rate 1.42965860557421 
loss 0.0812043859596057 learn_rate 1.50114153585292 
loss 0.0812042358818396 learn_rate 1.57619861264556 
loss 0.0812041018279162 learn_rate 1.65500854327784 
loss 0.0812039897656792 learn_rate 1.73775897044173 
loss 0.0812039021544214 learn_rate 1.82464691896382 
loss 0.0812038456083322 learn_rate 1.91587926491201 
loss 0.0812038243260292 learn_rate 2.01167322815761 
loss 0.0812038468932432 learn_rate 1.00583661407881 
loss 0.0812028123738173 learn_rate 1.05612844478275 
loss 0.0812026700192728 learn_rate 1.10893486702188 
loss 0.0812024549756523 learn_rate 1.16438161037298 
loss 0.0812022602195358 learn_rate 1.22260069089163 
loss 0.0812020672612885 learn_rate 1.28373072543621 
loss 0.0812018835526069 learn_rate 1.34791726170802 
loss 0.0812017091912728 learn_rate 1.41531312479342 
loss 0.0812015474663055 learn_rate 1.48607878103309 
loss 0.0812014007132942 learn_rate 1.56038272008475 
loss 0.0812012724128972 learn_rate 1.63840185608898 
loss 0.0812011660871218 learn_rate 1.72032194889343 
loss 0.0812010861716549 learn_rate 1.8063380463381 
loss 0.0812010375102552 learn_rate 1.89665494865501 
loss 0.0812010260021072 learn_rate 1.99148769608776 
loss 0.0812010582261593 learn_rate 0.99574384804388 
loss 0.0812000520957834 learn_rate 1.04553104044607 
loss 0.0811998997773515 learn_rate 1.09780759246838 
loss 0.0811996856784179 learn_rate 1.1526979720918 
loss 0.0811994891817133 learn_rate 1.21033287069639 
loss 0.0811992959898482 learn_rate 1.27084951423121 
loss 0.0811991122158909 learn_rate 1.33439198994277 
loss 0.0811989385544121 learn_rate 1.4011115894399 
loss 0.0811987780672244 learn_rate 1.4711671689119 
loss 0.0811986332648583 learn_rate 1.5447255273575 
loss 0.0811985075550828 learn_rate 1.62196180372537 
loss 0.0811984045334134 learn_rate 1.70305989391164 
loss 0.0811983285705171 learn_rate 1.78821288860722 
loss 0.0811982845274741 learn_rate 1.87762353303758 
loss 0.0811982781930507 learn_rate 1.97150470968946 
loss 0.0811983161079876 learn_rate 0.98575235484473 
loss 0.0811973364605188 learn_rate 1.03503997258697 
loss 0.0811971760146922 learn_rate 1.08679197121632 
loss 0.0811969635344031 learn_rate 1.14113156977713 
loss 0.0811967663431905 learn_rate 1.19818814826599 
loss 0.0811965736061459 learn_rate 1.25809755567929 
loss 0.0811963903759536 learn_rate 1.32100243346325 
loss 0.0811962177861078 learn_rate 1.38705255513641 
loss 0.0811960587248182 learn_rate 1.45640518289324 
loss 0.0811959158139185 learn_rate 1.5292254420379 
loss 0.0811957923985904 learn_rate 1.60568671413979 
loss 0.0811956921061496 learn_rate 1.68597104984678 
loss 0.0811956192438185 learn_rate 1.77026960233912 
loss 0.0811955786552956 learn_rate 1.85878308245608 
loss 0.0811955760243094 learn_rate 1.95172223657888 
loss 0.0811956178201066 learn_rate 0.97586111828944 
loss 0.0811946641177082 learn_rate 1.02465417420391 
loss 0.0811944977308565 learn_rate 1.07588688291411 
loss 0.0811942881097975 learn_rate 1.12968122705981 
loss 0.0811940918021234 learn_rate 1.1861652884128 
loss 0.0811939008278358 learn_rate 1.24547355283344 
loss 0.0811937194018746 learn_rate 1.30774723047512 
loss 0.0811935489745233 learn_rate 1.37313459199887 
loss 0.081193392298578 learn_rate 1.44179132159882 
loss 0.0811932520595844 learn_rate 1.51388088767876 
loss 0.0811931315447447 learn_rate 1.58957493206269 
loss 0.0811930343820778 learn_rate 1.66905367866583 
loss 0.0811929648131804 learn_rate 1.75250636259912 
loss 0.0811929276387097 learn_rate 1.84013168072908 
loss 0.0811929284374193 learn_rate 0.920065840364539 
loss 0.0811920953180734 learn_rate 0.966069132382766 
loss 0.0811919011278457 learn_rate 1.0143725890019 
loss 0.0811916999227403 learn_rate 1.065091218452 
loss 0.0811915037788994 learn_rate 1.1183457793746 
loss 0.0811913130731932 learn_rate 1.17426306834333 
loss 0.0811911296560777 learn_rate 1.2329762217605 
loss 0.0811909553761331 learn_rate 1.29462503284852 
loss 0.0811907923967513 learn_rate 1.35935628449095 
loss 0.0811906431560435 learn_rate 1.42732409871549 
loss 0.0811905104368366 learn_rate 1.49869030365127 
loss 0.0811903974057148 learn_rate 1.57362481883383 
loss 0.0811903076791062 learn_rate 1.65230605977552 
loss 0.0811902453904502 learn_rate 1.7349213627643 
loss 0.0811902152769781 learn_rate 1.82166743090252 
loss 0.0811902227773179 learn_rate 0.910833715451258 
loss 0.0811894148161051 learn_rate 0.956375401223821 
loss 0.0811892230662494 learn_rate 1.00419417128501 
loss 0.0811890292392408 learn_rate 1.05440387984926 
loss 0.0811888401371518 learn_rate 1.10712407384173 
loss 0.0811886566771614 learn_rate 1.16248027753381 
loss 0.0811884805963093 learn_rate 1.2206042914105 
loss 0.0811883137447256 learn_rate 1.28163450598103 
loss 0.0811881582508281 learn_rate 1.34571623128008 
loss 0.0811880165214939 learn_rate 1.41300204284408 
loss 0.0811878912957798 learn_rate 1.48365214498629 
loss 0.0811877856887787 learn_rate 1.5578347522356 
loss 0.0811877032521003 learn_rate 1.63572648984738 
loss 0.0811876480406697 learn_rate 1.71751281433975 
loss 0.0811876246948804 learn_rate 1.80338845505674 
loss 0.0811876385358272 learn_rate 0.901694227528369 
loss 0.0811868551794538 learn_rate 0.946778938904788 
loss 0.081186666397457 learn_rate 0.994117885850027 
loss 0.08118648005996 learn_rate 1.04382378014253 
loss 0.0811862982338621 learn_rate 1.09601496914966 
loss 0.0811861221971671 learn_rate 1.15081571760714 
loss 0.0811859536070948 learn_rate 1.2083565034875 
loss 0.0811857943010545 learn_rate 1.26877432866187 
loss 0.0811856463716961 learn_rate 1.33221304509496 
loss 0.0811855121888922 learn_rate 1.39882369734971 
loss 0.0811853944442733 learn_rate 1.4687648822172 
loss 0.0811852961967494 learn_rate 1.54220312632806 
loss 0.0811852209292816 learn_rate 1.61931328264446 
loss 0.0811851726143154 learn_rate 1.70027894677668 
loss 0.0811851557923238 learn_rate 1.78529289411552 
loss 0.0811851756641748 learn_rate 0.892646447057759 
loss 0.0811844163654062 learn_rate 0.937278769410647 
loss 0.0811842309649654 learn_rate 0.984142707881179 
loss 0.0811840521487905 learn_rate 1.03334984327524 
loss 0.0811838777358565 learn_rate 1.085017335439 
loss 0.0811837092135645 learn_rate 1.13926820221095 
loss 0.0811835481834738 learn_rate 1.1962316123215 
loss 0.081183396460361 learn_rate 1.25604319293757 
loss 0.0811832560999518 learn_rate 1.31884535258445 
loss 0.0811831294309704 learn_rate 1.38478762021367 
loss 0.081183019094739 learn_rate 1.45402700122436 
loss 0.0811829280906123 learn_rate 1.52672835128558 
loss 0.0811828598301078 learn_rate 1.60306476884985 
loss 0.0811828182005051 learn_rate 1.68321800729235 
loss 0.0811828076403568 learn_rate 1.76737890765696 
loss 0.0811828332288286 learn_rate 0.883689453828482 
loss 0.0811820974158562 learn_rate 0.927873926519907 
loss 0.0811819156920553 learn_rate 0.974267622845902 
loss 0.0811817443403636 learn_rate 1.0229810039882 
loss 0.0811815773737949 learn_rate 1.07413005418761 
loss 0.0811814163621308 learn_rate 1.12783655689699 
loss 0.0811812628677701 learn_rate 1.18422838474184 
loss 0.0811811186762663 learn_rate 1.24343980397893 
loss 0.0811809858051728 learn_rate 1.30561179417787 
loss 0.081180866539162 learn_rate 1.37089238388677 
loss 0.0811807634670779 learn_rate 1.43943700308111 
loss 0.0811806795263849 learn_rate 1.51140885323516 
loss 0.0811806180552404 learn_rate 1.58697929589692 
loss 0.0811805828540763 learn_rate 1.66632826069177 
loss 0.0811805782584184 learn_rate 1.74964467372636 
loss 0.081180609225199 learn_rate 0.874822336863178 
loss 0.0811798962971898 learn_rate 0.918563453706337 
loss 0.0811797184396747 learn_rate 0.964491626391654 
loss 0.0811795544167703 learn_rate 1.01271620771124 
loss 0.0811793948373526 learn_rate 1.0633520180968 
loss 0.0811792412478261 learn_rate 1.11651961900164 
loss 0.0811790951817479 learn_rate 1.17234559995172 
loss 0.0811789583913225 learn_rate 1.23096287994931 
loss 0.0811788328547538 learn_rate 1.29251102394677 
loss 0.0811787208106992 learn_rate 1.35713657514411 
loss 0.0811786247939715 learn_rate 1.42499340390131 
loss 0.0811785476786241 learn_rate 1.49624307409638 
loss 0.0811784927282591 learn_rate 1.5710552278012 
loss 0.0811784636555698 learn_rate 1.64960798919126 
loss 0.0811784646926704 learn_rate 0.82480399459563 
loss 0.0811778441332601 learn_rate 0.866044194325411 
loss 0.0811776580996254 learn_rate 0.909346404041682 
loss 0.0811774994684114 learn_rate 0.954813724243766 
loss 0.0811773453875518 learn_rate 1.00255441045595 
loss 0.0811771959593184 learn_rate 1.05268213097875 
loss 0.0811770525252931 learn_rate 1.10531623752769 
loss 0.0811769165943386 learn_rate 1.16058204940407 
loss 0.0811767898722451 learn_rate 1.21861115187428 
loss 0.0811766742870186 learn_rate 1.27954170946799 
loss 0.0811765720192829 learn_rate 1.34351879494139 
loss 0.0811764855376411 learn_rate 1.41069473468846 
loss 0.0811764176402061 learn_rate 1.48122947142288 
loss 0.0811763715034425 learn_rate 1.55529094499403 
loss 0.0811763507397915 learn_rate 1.63305549224373 
loss 0.0811763594657905 learn_rate 0.816527746121865 
loss 0.0811757601960641 learn_rate 0.857354133427959 
loss 0.0811755809998954 learn_rate 0.900221840099356 
loss 0.0811754310828648 learn_rate 0.945232932104324 
loss 0.0811752861235372 learn_rate 0.992494578709541 
loss 0.0811751458658933 learn_rate 1.04211930764502 
loss 0.0811750116051714 learn_rate 1.09422527302727 
loss 0.0811748848135213 learn_rate 1.14893653667863 
loss 0.0811747671538059 learn_rate 1.20638336351256 
loss 0.0811746605050439 learn_rate 1.26670253168819 
loss 0.0811745669917794 learn_rate 1.3300376582726 
loss 0.0811744890183954 learn_rate 1.39653954118623 
loss 0.0811744293093557 learn_rate 1.46636651824554 
loss 0.0811743909565267 learn_rate 1.53968484415782 
loss 0.08117437747498 learn_rate 1.61666908636571 
loss 0.0811743928689377 learn_rate 0.808334543182856 
loss 0.0811738138621852 learn_rate 0.848751270341999 
loss 0.0811736408946194 learn_rate 0.891188833859099 
loss 0.0811734988897706 learn_rate 0.935748275552054 
loss 0.0811733622549039 learn_rate 0.982535689329656 
loss 0.0811732303646987 learn_rate 1.03166247379614 
loss 0.0811731044611925 learn_rate 1.08324559748595 
loss 0.081172985980908 learn_rate 1.13740787736024 
loss 0.0811728765450172 learn_rate 1.19427827122826 
loss 0.0811727779849414 learn_rate 1.25399218478967 
loss 0.0811726923707306 learn_rate 1.31669179402915 
loss 0.0811726220443517 learn_rate 1.38252638373061 
loss 0.0811725696586789 learn_rate 1.45165270291714 
loss 0.0811725382233368 learn_rate 1.524235338063 
loss 0.0811725311587326 learn_rate 1.60044710496615 
loss 0.081172552359892 learn_rate 0.800223552483074 
loss 0.0811719926983774 learn_rate 0.840234730107227 
loss 0.0811718254570407 learn_rate 0.882246466612589 
loss 0.0811716906819249 learn_rate 0.926358789943218 
loss 0.0811715616897648 learn_rate 0.972676729440379 
loss 0.0811714374806431 learn_rate 1.0213105659124 
loss 0.081171319236222 learn_rate 1.07237609420802 
loss 0.0811712083580532 learn_rate 1.12599489891842 
loss 0.0811711064267254 learn_rate 1.18229464386434 
loss 0.0811710152272937 learn_rate 1.24140937605756 
loss 0.0811709367767479 learn_rate 1.30347984486044 
loss 0.081170873356293 learn_rate 1.36865383710346 
loss 0.0811708275491286 learn_rate 1.43708652895863 
loss 0.0811708022848532 learn_rate 1.50894085540656 
loss 0.081170800891783 learn_rate 1.58438789817689 
loss 0.0811708271587408 learn_rate 0.792193949088445 
loss 0.0811702860068233 learn_rate 0.831803646542867 
loss 0.0811701240662688 learn_rate 0.873393828870011 
loss 0.0811699959277428 learn_rate 0.917063520313511 
loss 0.0811698739823319 learn_rate 0.962916696329187 
loss 0.0811697568551151 learn_rate 1.01106253114565 
loss 0.0811696456602028 learn_rate 1.06161565770293 
loss 0.0811695417644546 learn_rate 1.11469644058808 
loss 0.0811694467089002 learn_rate 1.17043126261748 
loss 0.0811693622333579 learn_rate 1.22895282574835 
loss 0.081169290303089 learn_rate 1.29040046703577 
loss 0.0811692331400864 learn_rate 1.35492049038756 
loss 0.0811691932596823 learn_rate 1.42266651490694 
loss 0.0811691735135587 learn_rate 1.49379984065228 
loss 0.0811691771404033 learn_rate 0.746899920326142 
loss 0.0811687071405731 learn_rate 0.784244916342449 
loss 0.0811685443929882 learn_rate 0.823457162159572 
loss 0.081168419397178 learn_rate 0.86463002026755 
loss 0.0811683019209534 learn_rate 0.907861521280928 
loss 0.0811681885883786 learn_rate 0.953254597344974 
loss 0.0811680799956871 learn_rate 1.00091732721222 
loss 0.0811679772945461 learn_rate 1.05096319357283 
loss 0.0811678818105466 learn_rate 1.10351135325148 
loss 0.0811677950395643 learn_rate 1.15868692091405 
loss 0.0811677186702069 learn_rate 1.21662126695975 
loss 0.0811676546098123 learn_rate 1.27745233030774 
loss 0.0811676050147874 learn_rate 1.34132494682313 
loss 0.0811675723261261 learn_rate 1.40839119416428 
loss 0.081167559311118 learn_rate 1.4788107538725 
loss 0.0811675691124622 learn_rate 0.739405376936249 
loss 0.0811671158131104 learn_rate 0.776375645783061 
loss 0.0811669595536296 learn_rate 0.815194428072214 
loss 0.0811668416145509 learn_rate 0.855954149475825 
loss 0.0811667315983504 learn_rate 0.898751856949616 
loss 0.0811666258254436 learn_rate 0.943689449797097 
loss 0.0811665247854668 learn_rate 0.990873922286952 
loss 0.0811664295868831 learn_rate 1.0404176184013 
loss 0.0811663415170981 learn_rate 1.09243849932136 
loss 0.0811662620291082 learn_rate 1.14706042428743 
loss 0.0811661927629126 learn_rate 1.2044134455018 
loss 0.0811661355707569 learn_rate 1.26463411777689 
loss 0.0811660925465624 learn_rate 1.32786582366574 
loss 0.0811660660603685 learn_rate 1.39425911484903 
loss 0.081166058798772 learn_rate 1.46397207059148 
loss 0.0811660738125261 learn_rate 0.731986035295739 
loss 0.0811656362878182 learn_rate 0.768585337060526 
loss 0.0811654858543018 learn_rate 0.807014603913552 
loss 0.0811653742025992 learn_rate 0.84736533410923 
loss 0.081165270853209 learn_rate 0.889733600814692 
 training_time 00:00:00.1184690 
memory 0
Save model to model.txt
=== END program1: ./run learn ../dataset2/train --- OK [0s]
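
A note on the trace above: whenever the loss improves, the step size grows by about 5% on the next pass, and whenever it worsens, the step size is halved (e.g. learn_rate 3.00719 drops to 1.50360 immediately after an uptick in loss). Below is a sketch of this bold-driver-style heuristic; the multipliers are read off the log, not taken from MyMediaLite's source.

# Bold-driver-style step size adaptation as observed in the log:
# grow the rate ~5% after an improving pass, halve it after a
# worsening one. Multipliers inferred from the trace above.
def adapt_learn_rate(prev_loss, curr_loss, learn_rate,
                     grow=1.05, shrink=0.5):
    if curr_loss <= prev_loss:
        return learn_rate * grow    # loss improved: speed up
    return learn_rate * shrink      # loss worsened: back off

adapt_learn_rate(0.0812444, 0.0812536, 3.00719)  # -> 1.5036, as in the log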

===== MAIN: predict/evaluate on train data =====
=== START program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in
=== END program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in --- OK [2s]
=== START program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out
loading_time 0.12
ratings range: [0, 4]
training data: 3 users, 3 items, 5 ratings, sparsity 44.44444
test data:     3 users, 3 items, 5 ratings, sparsity 44.44444
Load model from model.txt
Set num_factors to 3
BiasedMatrixFactorization num_factors=5 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 3.14862 MAE 2.96525 NMAE 0.74131 testing_time 00:00:00.0030520
predicting_time 00:00:00.0036600
memory 0
=== END program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out --- OK [0s]
=== START program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out
=== END program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out --- OK [2s]
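
The sparsity figures reported in the predict steps are simply the share of empty cells in the user-item matrix; a one-line check, using the values from the log:

# Sparsity = share of unfilled cells in the user-item matrix.
def sparsity_percent(num_users, num_items, num_ratings):
    return 100.0 * (1.0 - num_ratings / (num_users * num_items))

sparsity_percent(3, 3, 5)  # 44.44..., the train split above
sparsity_percent(3, 3, 4)  # 55.55..., the test split below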

===== MAIN: predict/evaluate on test data =====
=== START program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in
=== END program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in --- OK [1s]
=== START program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out
loading_time 0.03
ratings range: [0, 4]
training data: 3 users, 3 items, 5 ratings, sparsity 44.44444
test data:     3 users, 3 items, 4 ratings, sparsity 55.55556
Load model from model.txt
Set num_factors to 3
BiasedMatrixFactorization num_factors=5 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 2.02961 MAE 1.20658 NMAE 0.30164 testing_time 00:00:00.0025890
predicting_time 00:00:00.0034350
memory 0
=== END program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out --- OK [0s]
=== START program4: ./run evaluate ../dataset2/test ../program0/evalTest.out
=== END program4: ./run evaluate ../dataset2/test ../program0/evalTest.out --- OK [2s]
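
The three error measures in the predict lines relate as follows: NMAE is MAE divided by the rating range, so with range [0, 4] the test MAE of 1.20658 yields NMAE 0.30164, matching the log. A minimal sketch of the metrics (standard definitions, not MyMediaLite's code):

import math

# Root mean squared error over paired predictions and ground truth.
def rmse(pred, truth):
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth))

# Mean absolute error.
def mae(pred, truth):
    return sum(abs(p - t) for p, t in zip(pred, truth)) / len(truth)

# Normalized MAE: MAE divided by the rating range, [0, 4] per the log.
def nmae(pred, truth, r_min=0.0, r_max=4.0):
    return mae(pred, truth) / (r_max - r_min)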


real	0m9.422s
user	0m6.900s
sys	0m2.144s
