Run 14102
Creator: zenogantner
Program: MyMediaLite-biased-matrix-factorization-k-10
Dataset: collaborativefiltering-sample
Task type: CollaborativeFiltering
Created: 6 years, 45 days ago
Download: login required
Status: Done!
Time: 4s
Memory: 423M
Results (CollaborativeFiltering): 0.064, 0.044, 2.27, 1.78

Log file

... (lines omitted) ...
loss 0.0831649905619739 learn_rate 2.62405717366263 
loss 0.0831571986186588 learn_rate 2.75526003234576 
loss 0.0831578734403 learn_rate 1.37763001617288 
loss 0.0821068576239632 learn_rate 1.44651151698152 
loss 0.0824774882694632 learn_rate 0.723255758490761 
loss 0.0821237794665377 learn_rate 0.759418546415299 
loss 0.0820737850881069 learn_rate 0.797389473736064 
loss 0.0820581621074318 learn_rate 0.837258947422868 
loss 0.0820479667571212 learn_rate 0.879121894794011 
loss 0.0820386989999564 learn_rate 0.923077989533712 
loss 0.082029775288337 learn_rate 0.969231889010397 
loss 0.0820212077390237 learn_rate 1.01769348346092 
loss 0.082013076273292 learn_rate 1.06857815763396 
loss 0.082005472672965 learn_rate 1.12200706551566 
loss 0.081998498821647 learn_rate 1.17810741879144 
loss 0.0819922677243421 learn_rate 1.23701278973102 
loss 0.0819869045805784 learn_rate 1.29886342921757 
loss 0.0819825479676817 learn_rate 1.36380660067845 
loss 0.0819793511399675 learn_rate 1.43199693071237 
loss 0.0819774834563472 learn_rate 1.50359677724799 
loss 0.0819771319487187 learn_rate 1.57877661611039 
loss 0.0819785030450108 learn_rate 0.789388308055193 
loss 0.081782302056245 learn_rate 0.828857723457953 
loss 0.0817619780857164 learn_rate 0.870300609630851 
loss 0.0817569720305187 learn_rate 0.913815640112393 
loss 0.0817536998159417 learn_rate 0.959506422118013 
loss 0.0817509203985479 learn_rate 1.00748174322391 
loss 0.0817486405215068 learn_rate 1.05785583038511 
loss 0.081746941693518 learn_rate 1.11074862190437 
loss 0.0817459141913287 learn_rate 1.16628605299958 
loss 0.0817456582966983 learn_rate 1.22460035564956 
loss 0.0817462852889768 learn_rate 0.612300177824781 
loss 0.081657210965375 learn_rate 0.642915186716021 
loss 0.0816342213475542 learn_rate 0.675060946051822 
loss 0.0816263663621526 learn_rate 0.708813993354413 
loss 0.0816227668338292 learn_rate 0.744254693022133 
loss 0.0816205012892693 learn_rate 0.78146742767324 
loss 0.081618760094331 learn_rate 0.820540799056902 
loss 0.0816173656048857 learn_rate 0.861567839009747 
loss 0.081616318609282 learn_rate 0.904646230960235 
loss 0.0816156656056231 learn_rate 0.949878542508247 
loss 0.0816154665011516 learn_rate 0.997372469633659 
loss 0.0816157889842197 learn_rate 0.49868623481683 
loss 0.0815691536041544 learn_rate 0.523620546557671 
loss 0.0815517705378744 learn_rate 0.549801573885555 
loss 0.0815441760476522 learn_rate 0.577291652579832 
loss 0.0815403223650677 learn_rate 0.606156235208824 
loss 0.0815380493861744 learn_rate 0.636464046969265 
loss 0.0815365008452016 learn_rate 0.668287249317728 
loss 0.0815353378695747 learn_rate 0.701701611783615 
loss 0.0815344452133614 learn_rate 0.736786692372796 
loss 0.0815338036765638 learn_rate 0.773626026991435 
loss 0.0815334336835578 learn_rate 0.812307328341007 
loss 0.0815333722086624 learn_rate 0.852922694758058 
loss 0.0815336646147954 learn_rate 0.426461347379029 
loss 0.0815054703636628 learn_rate 0.44778441474798 
loss 0.0814927157316787 learn_rate 0.470173635485379 
loss 0.0814863097631957 learn_rate 0.493682317259648 
loss 0.0814827567739743 learn_rate 0.518366433122631 
loss 0.0814806018231515 learn_rate 0.544284754778762 
loss 0.0814791818378457 learn_rate 0.571498992517701 
loss 0.0814781795948893 learn_rate 0.600073942143586 
loss 0.0814774480745864 learn_rate 0.630077639250765 
loss 0.0814769294659134 learn_rate 0.661581521213303 
loss 0.0814766129642488 learn_rate 0.694660597273968 
loss 0.0814765121412724 learn_rate 0.729393627137667 
loss 0.0814766533871182 learn_rate 0.364696813568833 
loss 0.0814594750205471 learn_rate 0.382931654247275 
loss 0.0814504189121563 learn_rate 0.402078236959639 
loss 0.0814452909173429 learn_rate 0.422182148807621 
loss 0.0814421733153852 learn_rate 0.443291256248002 
loss 0.0814401560233571 learn_rate 0.465455819060402 
loss 0.0814387803337806 learn_rate 0.488728610013422 
loss 0.0814378017021568 learn_rate 0.513165040514093 
loss 0.0814370874201433 learn_rate 0.538823292539798 
loss 0.0814365690051173 learn_rate 0.565764457166788 
loss 0.081436217026166 learn_rate 0.594052680025127 
loss 0.0814360261967074 learn_rate 0.623755314026384 
loss 0.0814360061502626 learn_rate 0.654943079727703 
loss 0.0814361758753205 learn_rate 0.327471539863851 
loss 0.081424102771651 learn_rate 0.343845116857044 
loss 0.0814171835436823 learn_rate 0.361037372699896 
loss 0.0814130008920321 learn_rate 0.379089241334891 
loss 0.0814103318725208 learn_rate 0.398043703401636 
loss 0.0814085468585506 learn_rate 0.417945888571717 
loss 0.081407308717347 learn_rate 0.438843183000303 
loss 0.0814064281357204 learn_rate 0.460785342150318 
loss 0.0814057955019656 learn_rate 0.483824609257834 
loss 0.0814053482079329 learn_rate 0.508015839720726 
loss 0.0814050536721185 learn_rate 0.533416631706762 
loss 0.0814048994047372 learn_rate 0.5600874632921 
loss 0.0814048865828507 learn_rate 0.588091836456705 
loss 0.0814050257442193 learn_rate 0.294045918228353 
loss 0.0813965014623732 learn_rate 0.30874821413977 
loss 0.0813912441649911 learn_rate 0.324185624846759 
loss 0.0813878682155421 learn_rate 0.340394906089097 
loss 0.0813856073718596 learn_rate 0.357414651393552 
loss 0.0813840364045976 learn_rate 0.375285383963229 
loss 0.0813829142464052 learn_rate 0.394049653161391 
loss 0.0813820994669968 learn_rate 0.41375213581946 
loss 0.0813815064149587 learn_rate 0.434439742610433 
loss 0.0813810828745553 learn_rate 0.456161729740955 
loss 0.0813807984360651 learn_rate 0.478969816228003 
loss 0.0813806379545732 learn_rate 0.502918307039403 
loss 0.0813805974225046 learn_rate 0.528064222391373 
loss 0.0813806811053754 learn_rate 0.264032111195687 
loss 0.081374637727282 learn_rate 0.277233716755471 
loss 0.0813706609700684 learn_rate 0.291095402593245 
loss 0.0813679616960101 learn_rate 0.305650172722907 
loss 0.0813660679033316 learn_rate 0.320932681359052 
loss 0.0813646990065883 learn_rate 0.336979315427005 
loss 0.0813636870230928 learn_rate 0.353828281198355 
loss 0.0813629293552008 learn_rate 0.371519695258273 
loss 0.0813623617404956 learn_rate 0.390095680021186 
loss 0.0813619433403919 learn_rate 0.409600464022246 
loss 0.0813616486846848 learn_rate 0.430080487223358 
loss 0.0813614632411897 learn_rate 0.451584511584526 
loss 0.0813613807924983 learn_rate 0.474163737163752 
loss 0.0813614016938028 learn_rate 0.237081868581876 
loss 0.0813570991955079 learn_rate 0.24893596201097 
loss 0.0813541004055593 learn_rate 0.261382760111519 
loss 0.0813519590856516 learn_rate 0.274451898117094 
loss 0.0813503891689214 learn_rate 0.288174493022949 
loss 0.0813492095998591 learn_rate 0.302583217674097 
loss 0.0813483062116626 learn_rate 0.317712378557802 
loss 0.0813476065257462 learn_rate 0.333597997485692 
loss 0.081347063852119 learn_rate 0.350277897359976 
loss 0.0813466477187191 learn_rate 0.367791792227975 
loss 0.0813463383522181 learn_rate 0.386181381839374 
loss 0.0813461235837152 learn_rate 0.405490450931343 
loss 0.0813459971026643 learn_rate 0.42576497347791 
loss 0.08134595740559 learn_rate 0.447053222151805 
loss 0.0813460070832502 learn_rate 0.223526611075903 
loss 0.0813424825779848 learn_rate 0.234702941629698 
loss 0.0813399627165228 learn_rate 0.246438088711183 
loss 0.0813381246517729 learn_rate 0.258759993146742 
loss 0.0813367541710083 learn_rate 0.271697992804079 
loss 0.0813357113331709 learn_rate 0.285282892444283 
loss 0.0813349054900013 learn_rate 0.299547037066497 
loss 0.0813342779085806 learn_rate 0.314524388919822 
loss 0.0813337902325268 learn_rate 0.330250608365813 
loss 0.0813334171853628 learn_rate 0.346763138784104 
loss 0.0813331421778182 learn_rate 0.364101295723309 
loss 0.0813329547770616 learn_rate 0.382306360509474 
loss 0.0813328492852812 learn_rate 0.401421678534948 
loss 0.0813328239267378 learn_rate 0.421492762461695 
loss 0.0813328803405153 learn_rate 0.210746381230848 
loss 0.0813299851080338 learn_rate 0.22128370029239 
loss 0.081327864734335 learn_rate 0.23234788530701 
loss 0.0813262857891963 learn_rate 0.24396527957236 
loss 0.0813250883231103 learn_rate 0.256163543550978 
loss 0.0813241646685837 learn_rate 0.268971720728527 
loss 0.0813234432539637 learn_rate 0.282420306764953 
loss 0.0813228767700993 learn_rate 0.296541322103201 
loss 0.0813224339105537 learn_rate 0.311368388208361 
loss 0.0813220938846495 learn_rate 0.326936807618779 
loss 0.0813218429591326 learn_rate 0.343283647999718 
loss 0.081321672392687 learn_rate 0.360447830399704 
loss 0.0813215772612795 learn_rate 0.378470221919689 
loss 0.0813215558084263 learn_rate 0.397393733015674 
loss 0.0813216090762758 learn_rate 0.198696866507837 
loss 0.081319225709359 learn_rate 0.208631709833229 
loss 0.0813174399680697 learn_rate 0.21906329532489 
loss 0.0813160833468946 learn_rate 0.230016460091135 
loss 0.0813150368527686 learn_rate 0.241517283095691 
loss 0.0813142180902359 learn_rate 0.253593147250476 
loss 0.0813135709101066 learn_rate 0.266272804613 
loss 0.0813130574647136 learn_rate 0.27958644484365 
loss 0.0813126523841701 learn_rate 0.293565767085832 
loss 0.0813123387136349 learn_rate 0.308244055440124 
loss 0.081312105228136 learn_rate 0.32365625821213 
loss 0.0813119447599426 learn_rate 0.339839071122737 
loss 0.0813118532211745 learn_rate 0.356831024678874 
loss 0.0813118290679396 learn_rate 0.374672575912817 
loss 0.0813118730197252 learn_rate 0.187336287956409 
loss 0.0813099068653303 learn_rate 0.196703102354229 
loss 0.0813084015780897 learn_rate 0.206538257471941 
loss 0.0813072357381892 learn_rate 0.216865170345538 
loss 0.0813063211500835 learn_rate 0.227708428862814 
loss 0.0813055950929736 learn_rate 0.239093850305955 
loss 0.0813050137988885 learn_rate 0.251048542821253 
loss 0.0813045471994369 learn_rate 0.263600969962316 
loss 0.0813041748843206 learn_rate 0.276781018460431 
loss 0.0813038831416895 learn_rate 0.290620069383453 
loss 0.0813036629053717 learn_rate 0.305151072852626 
loss 0.0813035084161042 learn_rate 0.320408626495257 
loss 0.0813034164089449 learn_rate 0.33642905782002 
loss 0.0813033856609388 learn_rate 0.353250510711021 
loss 0.0813034167648418 learn_rate 0.17662525535551 
loss 0.0813017912601968 learn_rate 0.185456518123286 
loss 0.0813005210760309 learn_rate 0.19472934402945 
loss 0.081299518891656 learn_rate 0.204465811230923 
loss 0.081298719581554 learn_rate 0.214689101792469 
loss 0.0812980756526878 learn_rate 0.225423556882092 
loss 0.0812975532062027 learn_rate 0.236694734726197 
loss 0.0812971285219137 learn_rate 0.248529471462507 
loss 0.0812967853006883 learn_rate 0.260955945035632 
loss 0.0812965125450022 learn_rate 0.274003742287414 
loss 0.0812963030157913 learn_rate 0.287703929401785 
loss 0.0812961521764217 learn_rate 0.302089125871874 
loss 0.0812960575221211 learn_rate 0.317193582165468 
loss 0.0812960181937312 learn_rate 0.333053261273741 
loss 0.0812960347850472 learn_rate 0.16652663063687 
loss 0.081294687842495 learn_rate 0.174852962168714 
loss 0.0812936147594395 learn_rate 0.18359561027715 
loss 0.081292752869385 learn_rate 0.192775390791007 
loss 0.081292054261911 learn_rate 0.202414160330558 
loss 0.0812914831618642 learn_rate 0.212534868347085 
loss 0.0812910134811238 learn_rate 0.22316161176444 
loss 0.0812906266402941 learn_rate 0.234319692352662 
loss 0.0812903097215454 learn_rate 0.246035676970295 
loss 0.0812900539782911 learn_rate 0.25833746081881 
loss 0.0812898536958444 learn_rate 0.27125433385975 
loss 0.0812897053723002 learn_rate 0.284817050552738 
loss 0.0812896071720704 learn_rate 0.299057903080374 
loss 0.0812895585960607 learn_rate 0.314010798234393 
loss 0.081289560311594 learn_rate 0.157005399117197 
loss 0.0812884414837048 learn_rate 0.164855669073056 
loss 0.0812875336468331 learn_rate 0.173098452526709 
loss 0.0812867919283977 learn_rate 0.181753375153045 
loss 0.0812861812149821 learn_rate 0.190841043910697 
loss 0.0812856746972687 learn_rate 0.200383096106232 
loss 0.0812852524251973 learn_rate 0.210402250911543 
loss 0.0812848999502433 learn_rate 0.220922363457121 
loss 0.081284607113681 learn_rate 0.231968481629977 
loss 0.0812843670192514 learn_rate 0.243566905711475 
loss 0.0812841752080903 learn_rate 0.255745250997049 
loss 0.0812840290351109 learn_rate 0.268532513546902 
loss 0.0812839272309096 learn_rate 0.281959139224247 
loss 0.0812838696226962 learn_rate 0.296057096185459 
loss 0.0812838569820504 learn_rate 0.310859950994732 
loss 0.0812838909661485 learn_rate 0.155429975497366 
loss 0.0812828311894394 learn_rate 0.163201474272234 
loss 0.0812819711703741 learn_rate 0.171361547985846 
loss 0.0812812691405868 learn_rate 0.179929625385138 
loss 0.0812806922992609 learn_rate 0.188926106654395 
loss 0.0812802155207093 learn_rate 0.198372411987115 
loss 0.0812798200561015 learn_rate 0.208291032586471 
loss 0.0812794923012962 learn_rate 0.218705584215794 
loss 0.0812792226871944 learn_rate 0.229640863426584 
loss 0.0812790047309089 learn_rate 0.241122906597913 
loss 0.081278834267382 learn_rate 0.253179051927809 
loss 0.0812787088637915 learn_rate 0.265838004524199 
loss 0.0812786274047124 learn_rate 0.279129904750409 
loss 0.0812785898256012 learn_rate 0.29308639998793 
loss 0.0812785969661553 learn_rate 0.146543199993965 
loss 0.0812777100528227 learn_rate 0.153870359993663 
loss 0.081276978006657 learn_rate 0.161563877993346 
loss 0.0812763707936303 learn_rate 0.169642071893014 
loss 0.0812758643114721 learn_rate 0.178124175487664 
loss 0.0812754397105478 learn_rate 0.187030384262048 
loss 0.0812750826558244 learn_rate 0.19638190347515 
loss 0.0812747825799558 learn_rate 0.206200998648907 
loss 0.0812745319715096 learn_rate 0.216511048581353 
loss 0.0812743257334935 learn_rate 0.227336601010421 
loss 0.0812741606367043 learn_rate 0.238703431060942 
loss 0.0812740348812 learn_rate 0.250638602613989 
loss 0.0812739477685709 learn_rate 0.263170532744688 
loss 0.0812738994786645 learn_rate 0.276329059381922 
loss 0.0812738909376764 learn_rate 0.290145512351019 
loss 0.081273923760446 learn_rate 0.145072756175509 
loss 0.0812730832448652 learn_rate 0.152326393984285 
loss 0.0812723892188082 learn_rate 0.159942713683499 
loss 0.0812718137785843 learn_rate 0.167939849367674 
loss 0.0812713344596091 learn_rate 0.176336841836058 
loss 0.0812709336420035 learn_rate 0.185153683927861 
loss 0.0812705978941738 learn_rate 0.194411368124254 
loss 0.0812703173003025 learn_rate 0.204131936530466 
loss 0.0812700848128545 learn_rate 0.21433853335699 
loss 0.0812698956636499 learn_rate 0.225055460024839 
loss 0.0812697468576361 learn_rate 0.236308233026081 
loss 0.0812696367633685 learn_rate 0.248123644677385 
loss 0.0812695648043699 learn_rate 0.260529826911254 
loss 0.0812695312470185 learn_rate 0.273556318256817 
loss 0.0812695370740364 learn_rate 0.136778159128409 
loss 0.0812688306301515 learn_rate 0.143617067084829 
loss 0.0812682379716168 learn_rate 0.15079792043907 
loss 0.0812677390526984 learn_rate 0.158337816461024 
loss 0.081267317405591 learn_rate 0.166254707284075 
loss 0.0812669598618433 learn_rate 0.174567442648279 
loss 0.0812666562042632 learn_rate 0.183295814780693 
loss 0.0812663987783821 learn_rate 0.192460605519728 
loss 0.0812661820920813 learn_rate 0.202083635795714 
loss 0.0812660024293456 learn_rate 0.2121878175855 
loss 0.0812658574996831 learn_rate 0.222797208464775 
loss 0.0812657461390433 learn_rate 0.233937068888013 
loss 0.0812656680717686 learn_rate 0.245633922332414 
loss 0.0812656237368755 learn_rate 0.257915618449035 
loss 0.0812656141764371 learn_rate 0.270811399371487 
loss 0.0812656409794639 learn_rate 0.135405699685743 
loss 0.0812649714903203 learn_rate 0.14217598467003 
loss 0.0812644094751992 learn_rate 0.149284783903532 
loss 0.0812639363592136 learn_rate 0.156749023098709 
loss 0.0812635368394534 learn_rate 0.164586474253644 
loss 0.081263198648452 learn_rate 0.172815797966326 
loss 0.081262912249968 learn_rate 0.181456587864643 
loss 0.0812626704933226 learn_rate 0.190529417257875 
loss 0.0812624682525356 learn_rate 0.200055888120768 
loss 0.0812623020744603 learn_rate 0.210058682526807 
loss 0.0812621698563989 learn_rate 0.220561616653147 
loss 0.081262070568661 learn_rate 0.231589697485805 
loss 0.0812620040318472 learn_rate 0.243169182360095 
loss 0.0812619707529134 learn_rate 0.2553276414781 
loss 0.0812619718188349 learn_rate 0.12766382073905 
loss 0.0812614064888755 learn_rate 0.134047011776002 
loss 0.0812609248091759 learn_rate 0.140749362364802 
loss 0.08126051345033 learn_rate 0.147786830483043 
loss 0.0812601612210398 learn_rate 0.155176172007195 
loss 0.0812598589866483 learn_rate 0.162934980607554 
loss 0.0812595995287513 learn_rate 0.171081729637932 
loss 0.0812593773610233 learn_rate 0.179635816119829 
loss 0.0812591885179451 learn_rate 0.18861760692582 
loss 0.081259030333335 learn_rate 0.198048487272111 
loss 0.0812589012244975 learn_rate 0.207950911635717 
loss 0.0812588004955571 learn_rate 0.218348457217503 
loss 0.0812587281704085 learn_rate 0.229265880078378 
loss 0.0812586848620798 learn_rate 0.240729174082297 
loss 0.0812586716815421 learn_rate 0.252765632786412 
loss 0.081258690185513 learn_rate 0.126382816393206 
loss 0.0812581546104747 learn_rate 0.132701957212866 
loss 0.0812576978971264 learn_rate 0.139337055073509 
loss 0.081257307734778 learn_rate 0.146303907827185 
loss 0.0812569737620393 learn_rate 0.153619103218544 
loss 0.0812566875044702 learn_rate 0.161300058379471 
loss 0.0812564422564615 learn_rate 0.169365061298445 
loss 0.0812562329207534 learn_rate 0.177833314363367 
loss 0.0812560558206315 learn_rate 0.186724980081535 
loss 0.0812559085002989 learn_rate 0.196061229085612 
loss 0.081255789528145 learn_rate 0.205864290539893 
loss 0.0812556983157676 learn_rate 0.216157505066888 
loss 0.0812556349628888 learn_rate 0.226965380320232 
loss 0.0812556001350224 learn_rate 0.238313649336244 
loss 0.0812555949773111 learn_rate 0.250229331803056 
loss 0.0812556210646712 learn_rate 0.125114665901528 
loss 0.0812551086687825 learn_rate 0.131370399196604 
loss 0.081254671553087 learn_rate 0.137938919156434 
loss 0.0812542981860454 learn_rate 0.144835865114256 
loss 0.0812539788349594 learn_rate 0.152077658369969 
loss 0.0812537055197445 learn_rate 0.159681541288467 
loss 0.0812534719133603 learn_rate 0.167665618352891 
loss 0.0812532732008723 learn_rate 0.176048899270535 
loss 0.0812531059108289 learn_rate 0.184851344234062 
loss 0.0812529677332947 learn_rate 0.194093911445765 
loss 0.0812528573383716 learn_rate 0.203798607018054 
loss 0.0812527742075087 learn_rate 0.213988537368956 
loss 0.0812527184875209 learn_rate 0.224687964237404 
loss 0.0812526908742697 learn_rate 0.235922362449274 
loss 0.0812526925297794 learn_rate 0.117961181224637 
loss 0.0812522573426768 learn_rate 0.123859240285869 
loss 0.0812518809429429 learn_rate 0.130052202300162 
loss 0.0812515550663058 learn_rate 0.136554812415171 
loss 0.0812512726133169 learn_rate 0.143382553035929 
loss 0.0812510276650416 learn_rate 0.150551680687726 
loss 0.0812508154576718 learn_rate 0.158079264722112 
loss 0.0812506323219665 learn_rate 0.165983227958217 
loss 0.0812504755953112 learn_rate 0.174282389356128 
loss 0.0812503435154379 learn_rate 0.182996508823935 
loss 0.0812502351054054 learn_rate 0.192146334265132 
loss 0.0812501500592111 learn_rate 0.201753650978388 
 training_time 00:00:00.0926230 
memory 0
Save model to model.txt
=== END program1: ./run learn ../dataset2/train --- OK [1s]
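(Note: the learn_rate trace above grows by exactly 5% after every epoch that lowers the loss and is cut in half whenever the loss ticks back up, e.g. 2.62406 -> 2.75526, then 2.75526 -> 1.37763. That pattern resembles a bold-driver schedule, even though the serialized model later prints bold_driver=False, which may just be a default of the loaded model. A minimal sketch of the heuristic the trace suggests; the factors 1.05 and 0.5 are read off the log, not taken from MyMediaLite's source:)

def next_learn_rate(prev_loss, curr_loss, learn_rate, grow=1.05, shrink=0.5):
    # Grow the step size while training keeps improving; back off hard
    # when an epoch makes the loss worse (factors inferred from the log).
    if curr_loss <= prev_loss:
        return learn_rate * grow    # epoch improved: accelerate
    return learn_rate * shrink      # epoch got worse: halve the step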

===== MAIN: predict/evaluate on train data =====
=== START program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in
=== END program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in --- OK [1s]
=== START program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out
loading_time 0.03
ratings range: [0, 4]
training data: 3 users, 3 items, 5 ratings, sparsity 44.44444
test data:     3 users, 3 items, 5 ratings, sparsity 44.44444
Load model from model.txt
BiasedMatrixFactorization num_factors=10 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 3.14644 MAE 2.96351 NMAE 0.74088 testing_time 00:00:00.0028200
predicting_time 00:00:00.0034190
memory 0
=== END program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out --- OK [1s]
=== START program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out
=== END program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out --- OK [1s]
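(Note: the BiasedMatrixFactorization line above lists the model's hyperparameters; the model scores a user-item pair as the global rating average plus user and item bias terms plus a dot product of the k=10 latent factor vectors. A minimal sketch of that prediction rule, with illustrative names rather than MyMediaLite's actual API:)

import numpy as np

def predict_rating(global_mean, user_bias, item_bias, user_factors, item_factors):
    # r_hat(u, i) = mu + b_u + b_i + <p_u, q_i>
    return global_mean + user_bias + item_bias + float(np.dot(user_factors, item_factors))

(The logged sparsity is consistent with 1 - ratings/(users x items): with 3 users, 3 items, and 5 ratings, 1 - 5/9 ~= 44.44444%.)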

===== MAIN: predict/evaluate on test data =====
=== START program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in
=== END program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in --- OK [1s]
=== START program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out
loading_time 0.09
ratings range: [0, 4]
training data: 3 users, 3 items, 5 ratings, sparsity 44.44444
test data:     3 users, 3 items, 4 ratings, sparsity 55.55556
Load model from model.txt
BiasedMatrixFactorization num_factors=10 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 2.03179 MAE 1.21438 NMAE 0.30359 testing_time 00:00:00.0025670
predicting_time 00:00:00.0032220
memory 0
=== END program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out --- OK [1s]
=== START program4: ./run evaluate ../dataset2/test ../program0/evalTest.out
=== END program4: ./run evaluate ../dataset2/test ../program0/evalTest.out --- OK [1s]
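(Note: assuming NMAE here is MAE divided by the width of the rating range [0, 4] reported in both runs, the logged values check out, as does the test-set sparsity with 4 ratings in a 3x3 matrix:)

NMAE(train) = 2.96351 / (4 - 0) ~= 0.74088
NMAE(test)  = 1.21438 / (4 - 0) ~= 0.30359
sparsity(test) = 1 - 4/9 ~= 55.55556%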


real	0m9.025s
user	0m6.788s
sys	0m1.904s
