ServerRun:  14857
Creator:    internal
Program:    MyMediaLite-biased-matrix-factorization-k-60
Dataset:    movielens100k
Task type:  CollaborativeFiltering
Created:    5y 288d ago

Status:     Done
Runtime:    8m31s
Memory:     423M
Task type:  CollaborativeFiltering
Metrics:    0.449  0.351  0.954  0.749

Log file

... (lines omitted) ...
loss 44678.4464634554 learn_rate 0.00364232117678914 
loss 44676.1929514347 learn_rate 0.0038244372356286 
loss 44674.445478833 learn_rate 0.00401565909741003 
loss 44673.0210344403 learn_rate 0.00421644205228053 
loss 44671.8343263797 learn_rate 0.00442726415489456 
loss 44670.8446701931 learn_rate 0.00464862736263929 
loss 44670.0343698953 learn_rate 0.00488105873077125 
loss 44669.3989725622 learn_rate 0.00512511166730982 
loss 44668.9425467432 learn_rate 0.00538136725067531 
loss 44668.6752852614 learn_rate 0.00565043561320907 
loss 44668.612268363 learn_rate 0.00593295739386953 
loss 44668.7728479065 learn_rate 0.00296647869693476 
loss 44658.1979401123 learn_rate 0.0031148026317815 
loss 44653.4699462973 learn_rate 0.00327054276337058 
loss 44650.62263046 learn_rate 0.00343406990153911 
loss 44648.6081698655 learn_rate 0.00360577339661606 
loss 44647.046527382 learn_rate 0.00378606206644686 
loss 44645.7739495138 learn_rate 0.00397536516976921 
loss 44644.7130741574 learn_rate 0.00417413342825767 
loss 44643.8265115711 learn_rate 0.00438284009967055 
loss 44643.0976629205 learn_rate 0.00460198210465408 
loss 44642.5219630582 learn_rate 0.00483208120988679 
loss 44642.1025933598 learn_rate 0.00507368527038113 
loss 44641.8482859268 learn_rate 0.00532736953390018 
loss 44641.7721807921 learn_rate 0.00559373801059519 
loss 44641.8912501083 learn_rate 0.0027968690052976 
loss 44632.8326473623 learn_rate 0.00293671245556248 
loss 44628.6505928302 learn_rate 0.0030835480783406 
loss 44626.1018198339 learn_rate 0.00323772548225763 
loss 44624.290866042 learn_rate 0.00339961175637051 
loss 44622.8845843052 learn_rate 0.00356959234418904 
loss 44621.736880683 learn_rate 0.00374807196139849 
loss 44620.7777447207 learn_rate 0.00393547555946841 
loss 44619.9728556537 learn_rate 0.00413224933744183 
loss 44619.3066421171 learn_rate 0.00433886180431393 
loss 44618.7744484508 learn_rate 0.00455580489452962 
loss 44618.3786577511 learn_rate 0.0047835951392561 
loss 44618.1266874387 learn_rate 0.00502277489621891 
loss 44618.0299363258 learn_rate 0.00527391364102986 
loss 44618.1032475096 learn_rate 0.00263695682051493 
loss 44610.3406743627 learn_rate 0.00276880466154067 
loss 44606.6390384497 learn_rate 0.00290724489461771 
loss 44604.3521902699 learn_rate 0.00305260713934859 
loss 44602.7174100243 learn_rate 0.00320523749631602 
loss 44601.4436488082 learn_rate 0.00336549937113183 
loss 44600.4009871055 learn_rate 0.00353377433968842 
loss 44599.5262434561 learn_rate 0.00371046305667284 
loss 44598.787978466 learn_rate 0.00389598620950648 
loss 44598.1716044377 learn_rate 0.0040907855199818 
loss 44597.6724115118 learn_rate 0.00429532479598089 
loss 44597.292077594 learn_rate 0.00451009103577994 
loss 44597.0368461155 learn_rate 0.00473559558756894 
loss 44596.9165594876 learn_rate 0.00497237536694738 
loss 44596.9441599146 learn_rate 0.00248618768347369 
loss 44590.2907710682 learn_rate 0.00261049706764738 
loss 44587.0133549072 learn_rate 0.00274102192102975 
loss 44584.9581501118 learn_rate 0.00287807301708123 
loss 44583.4777284115 learn_rate 0.00302197666793529 
loss 44582.3187211631 learn_rate 0.00317307550133206 
loss 44581.365989278 learn_rate 0.00333172927639866 
loss 44580.5626944768 learn_rate 0.0034983157402186 
loss 44579.8801144631 learn_rate 0.00367323152722953 
loss 44579.3046166004 learn_rate 0.003856893103591 
loss 44578.8314805171 learn_rate 0.00404973775877055 
loss 44578.4617717447 learn_rate 0.00425222464670908 
loss 44578.2006928049 learn_rate 0.00446483587904454 
loss 44578.0566999019 learn_rate 0.00468807767299676 
loss 44578.0410411172 learn_rate 0.0049224815566466 
loss 44578.1675408258 learn_rate 0.0024612407783233 
loss 44571.779521932 learn_rate 0.00258430281723947 
loss 44568.6415951006 learn_rate 0.00271351795810144 
loss 44566.6901504427 learn_rate 0.00284919385600651 
loss 44565.2992251066 learn_rate 0.00299165354880684 
loss 44564.2226575251 learn_rate 0.00314123622624718 
loss 44563.3482824073 learn_rate 0.00329829803755954 
loss 44562.6206633722 learn_rate 0.00346321293943751 
loss 44562.0117957216 learn_rate 0.00363637358640939 
loss 44561.5084260566 learn_rate 0.00381819226572986 
loss 44561.1060243401 learn_rate 0.00400910187901635 
loss 44560.8057269769 learn_rate 0.00420955697296717 
loss 44560.6127219093 learn_rate 0.00442003482161553 
loss 44560.5353824338 learn_rate 0.00464103656269631 
loss 44560.5848138725 learn_rate 0.00232051828134815 
loss 44555.091128287 learn_rate 0.00243654419541556 
loss 44552.2973727268 learn_rate 0.00255837140518634 
loss 44550.527090567 learn_rate 0.00268628997544566 
loss 44549.2506963907 learn_rate 0.00282060447421794 
loss 44548.2546076042 learn_rate 0.00296163469792884 
loss 44547.4396877707 learn_rate 0.00310971643282528 
loss 44546.7561425193 learn_rate 0.00326520225446654 
loss 44546.1783736545 learn_rate 0.00342846236718987 
loss 44545.6939438803 learn_rate 0.00359988548554936 
loss 44545.2982669539 learn_rate 0.00377987975982683 
loss 44544.9918844818 learn_rate 0.00396887374781817 
loss 44544.7790121161 learn_rate 0.00416731743520908 
loss 44544.6667515611 learn_rate 0.00437568330696953 
loss 44544.6646728735 learn_rate 0.00459446747231801 
loss 44544.7846147424 learn_rate 0.00229723373615901 
loss 44539.4997427571 learn_rate 0.00241209542296696 
loss 44536.8163287633 learn_rate 0.0025327001941153 
loss 44535.1266529112 learn_rate 0.00265933520382107 
loss 44533.9186174466 learn_rate 0.00279230196401212 
loss 44532.9847375273 learn_rate 0.00293191706221273 
loss 44532.2284486311 learn_rate 0.00307851291532337 
loss 44531.6011814024 learn_rate 0.00323243856108953 
loss 44531.0779634806 learn_rate 0.00339406048914401 
loss 44530.6466827428 learn_rate 0.00356376351360121 
loss 44530.3029093769 learn_rate 0.00374195168928127 
loss 44530.0472351209 learn_rate 0.00392904927374534 
loss 44529.8838504487 learn_rate 0.0041255017374326 
loss 44529.8197723505 learn_rate 0.00433177682430423 
loss 44529.8644345142 learn_rate 0.00216588841215212 
loss 44525.3170780063 learn_rate 0.00227418283275972 
loss 44522.9268424507 learn_rate 0.00238789197439771 
loss 44521.3911565322 learn_rate 0.00250728657311759 
loss 44520.2784904942 learn_rate 0.00263265090177347 
loss 44519.4095938911 learn_rate 0.00276428344686215 
loss 44518.6994931586 learn_rate 0.00290249761920526 
loss 44518.1047571511 learn_rate 0.00304762250016552 
loss 44517.6026645525 learn_rate 0.0032000036251738 
loss 44517.1819103648 learn_rate 0.00336000380643248 
loss 44516.8380717447 learn_rate 0.00352800399675411 
loss 44516.5712526545 learn_rate 0.00370440419659182 
loss 44516.3848113018 learn_rate 0.00388962440642141 
loss 44516.2846623367 learn_rate 0.00408410562674248 
loss 44516.2789020829 learn_rate 0.0042883109080796 
loss 44516.3776255229 learn_rate 0.0021441554540398 
loss 44511.999114728 learn_rate 0.00225136322674179 
loss 44509.6996951523 learn_rate 0.00236393138807888 
loss 44508.2298099399 learn_rate 0.00248212795748282 
loss 44507.1722602683 learn_rate 0.00260623435535697 
loss 44506.3529776892 learn_rate 0.00273654607312481 
loss 44505.6892265558 learn_rate 0.00287337337678106 
loss 44505.1386587583 learn_rate 0.00301704204562011 
loss 44504.6791115337 learn_rate 0.00316789414790111 
loss 44504.2995710707 learn_rate 0.00332628885529617 
loss 44503.9957543533 learn_rate 0.00349260329806098 
loss 44503.767809763 learn_rate 0.00366723346296403 
loss 44503.6190727038 learn_rate 0.00385059513611223 
loss 44503.5553823131 learn_rate 0.00404312489291784 
loss 44503.5847140367 learn_rate 0.00202156244645892 
loss 44499.8164208245 learn_rate 0.00212264056878187 
loss 44497.7685012712 learn_rate 0.00222877259722096 
loss 44496.4315887963 learn_rate 0.00234021122708201 
loss 44495.4555701509 learn_rate 0.00245722178843611 
loss 44494.6907307139 learn_rate 0.00258008287785791 
loss 44494.0645966273 learn_rate 0.00270908702175081 
loss 44493.5394787802 learn_rate 0.00284454137283835 
loss 44493.0953145162 learn_rate 0.00298676844148027 
loss 44492.7218875433 learn_rate 0.00313610686355428 
loss 44492.4149815305 learn_rate 0.003292912206732 
loss 44492.1743575618 learn_rate 0.0034575578170686 
loss 44492.0026486516 learn_rate 0.00363043570792202 
loss 44491.9047467002 learn_rate 0.00381195749331813 
loss 44491.8874691521 learn_rate 0.00400255536798403 
loss 44491.9593932105 learn_rate 0.00200127768399202 
loss 44488.3289964757 learn_rate 0.00210134156819162 
loss 44486.3568427928 learn_rate 0.0022064086466012 
loss 44485.0746300797 learn_rate 0.00231672907893126 
loss 44484.1439978407 learn_rate 0.00243256553287782 
loss 44483.4196327954 learn_rate 0.00255419380952171 
loss 44482.8310068743 learn_rate 0.0026819034999978 
loss 44482.3413885041 learn_rate 0.00281599867499769 
loss 44481.9312147648 learn_rate 0.00295679860874757 
loss 44481.5905319551 learn_rate 0.00310463853918495 
loss 44481.3152506893 learn_rate 0.0032598704661442 
loss 44481.1051732323 learn_rate 0.00342286398945141 
loss 44480.9629135711 learn_rate 0.00359400718892398 
loss 44480.8932976993 learn_rate 0.00377370754837018 
loss 44480.903037133 learn_rate 0.00188685377418509 
loss 44477.7781201295 learn_rate 0.00198119646289434 
loss 44476.0222940286 learn_rate 0.00208025628603906 
loss 44474.8558384314 learn_rate 0.00218426910034101 
loss 44473.995991306 learn_rate 0.00229348255535807 
loss 44473.3183335675 learn_rate 0.00240815668312597 
loss 44472.7613710629 learn_rate 0.00252856451728227 
loss 44472.2925722535 learn_rate 0.00265499274314638 
loss 44471.8943224549 learn_rate 0.0027877423803037 
loss 44471.5574496494 learn_rate 0.00292712949931889 
loss 44471.2779825635 learn_rate 0.00307348597428483 
loss 44471.0554263568 learn_rate 0.00322716027299907 
loss 44470.8918097794 learn_rate 0.00338851828664902 
loss 44470.7911510586 learn_rate 0.00355794420098148 
loss 44470.7591641648 learn_rate 0.00373584141103055 
loss 44470.8031104098 learn_rate 0.00186792070551528 
loss 44467.7914645647 learn_rate 0.00196131674079104 
loss 44466.0993468689 learn_rate 0.00205938257783059 
loss 44464.9788784292 learn_rate 0.00216235170672212 
loss 44464.1569214525 learn_rate 0.00227046929205823 
loss 44463.5127867074 learn_rate 0.00238399275666114 
loss 44462.986667995 learn_rate 0.0025031923944942 
loss 44462.5468789282 learn_rate 0.00262835201421891 
loss 44462.1762519776 learn_rate 0.00275976961492985 
loss 44461.8658527776 learn_rate 0.00289775809567634 
loss 44461.6118264577 learn_rate 0.00304264600046016 
loss 44461.4137175132 learn_rate 0.00319477830048317 
loss 44461.2735399823 learn_rate 0.00335451721550733 
loss 44461.1952555456 learn_rate 0.00352224307628269 
loss 44461.1844861618 learn_rate 0.00369835523009683 
loss 44461.248368736 learn_rate 0.00184917761504841 
loss 44458.336532539 learn_rate 0.00194163649580083 
loss 44456.6978801808 learn_rate 0.00203871832059088 
loss 44455.6145535272 learn_rate 0.00214065423662042 
loss 44454.8225359619 learn_rate 0.00224768694845144 
loss 44454.2046535866 learn_rate 0.00236007129587401 
loss 44453.7026649722 learn_rate 0.00247807486066771 
loss 44453.2856437714 learn_rate 0.0026019786037011 
loss 44452.936804272 learn_rate 0.00273207753388616 
loss 44452.6473965421 learn_rate 0.00286868141058046 
loss 44452.4136373087 learn_rate 0.00301211548110949 
loss 44452.2350720361 learn_rate 0.00316272125516496 
loss 44452.1136669756 learn_rate 0.00332085731792321 
loss 44452.0532987333 learn_rate 0.00348690018381937 
loss 44452.0594727749 learn_rate 0.00174345009190969 
loss 44449.5512267643 learn_rate 0.00183062259650517 
loss 44448.0917118847 learn_rate 0.00192215372633043 
loss 44447.1048682668 learn_rate 0.00201826141264695 
loss 44446.3711009804 learn_rate 0.0021191744832793 
loss 44445.7905987922 learn_rate 0.00222513320744326 
loss 44445.3128488911 learn_rate 0.00233638986781543 
loss 44444.9106125971 learn_rate 0.0024532093612062 
loss 44444.5688672172 learn_rate 0.00257586982926651 
loss 44444.2796089117 learn_rate 0.00270466332072983 
loss 44444.0392122843 learn_rate 0.00283989648676632 
loss 44443.8470085954 learn_rate 0.00298189131110464 
loss 44443.7044908877 learn_rate 0.00313098587665987 
loss 44443.6148633323 learn_rate 0.00328753517049287 
loss 44443.5827904654 learn_rate 0.00345191192901751 
loss 44443.614268642 learn_rate 0.00172595596450875 
loss 44441.1952312558 learn_rate 0.00181225376273419 
loss 44439.7869523924 learn_rate 0.0019028664508709 
loss 44438.8369242979 learn_rate 0.00199800977341445 
loss 44438.133129377 learn_rate 0.00209791026208517 
loss 44437.5788126311 learn_rate 0.00220280577518943 
loss 44437.1248830712 learn_rate 0.0023129460639489 
loss 44436.744828554 learn_rate 0.00242859336714634 
loss 44436.4240170539 learn_rate 0.00255002303550366 
loss 44436.1546551934 learn_rate 0.00267752418727884 
loss 44435.9332221128 learn_rate 0.00281140039664279 
loss 44435.7590856294 learn_rate 0.00295197041647493 
loss 44435.6337275599 learn_rate 0.00309956893729867 
loss 44435.5603039823 learn_rate 0.00325454738416361 
loss 44435.543400305 learn_rate 0.00341727475337179 
loss 44435.5889056032 learn_rate 0.00170863737668589 
loss 44433.2485806529 learn_rate 0.00179406924552019 
loss 44431.8834798399 learn_rate 0.0018837727077962 
loss 44430.9633460018 learn_rate 0.00197796134318601 
loss 44430.2833434052 learn_rate 0.00207685941034531 
loss 44429.7495954474 learn_rate 0.00218070238086257 
loss 44429.3143290047 learn_rate 0.0022897374999057 
loss 44428.9516868455 learn_rate 0.00240422437490099 
loss 44428.6473732712 learn_rate 0.00252443559364604 
loss 44428.3937616938 learn_rate 0.00265065737332834 
loss 44428.1873994606 learn_rate 0.00278319024199476 
loss 44428.0276601264 learn_rate 0.00292234975409449 
loss 44427.9159878829 learn_rate 0.00306846724179922 
loss 44427.8554680628 learn_rate 0.00322189060388918 
loss 44427.8505875754 learn_rate 0.00338298513408364 
loss 44427.907111804 learn_rate 0.00169149256704182 
loss 44425.6410619794 learn_rate 0.00177606719539391 
loss 44424.3161893652 learn_rate 0.00186487055516361 
loss 44423.4234722278 learn_rate 0.00195811408292179 
loss 44422.7649944799 learn_rate 0.00205601978706788 
loss 44422.2496682682 learn_rate 0.00215882077642127 
loss 44421.8310009219 learn_rate 0.00226676181524233 
loss 44421.4837716759 learn_rate 0.00238009990600445 
loss 44421.1940106366 learn_rate 0.00249910490130467 
loss 44420.9542511738 learn_rate 0.00262406014636991 
loss 44420.7611039053 learn_rate 0.0027552631536884 
loss 44420.6139442822 learn_rate 0.00289302631137282 
loss 44420.514175783 learn_rate 0.00303767762694146 
loss 44420.4648105183 learn_rate 0.00318956150828854 
loss 44420.4702349915 learn_rate 0.00159478075414427 
loss 44418.5174299624 learn_rate 0.00167451979185148 
loss 44417.3377520655 learn_rate 0.00175824578144406 
loss 44416.5243677529 learn_rate 0.00184615807051626 
loss 44415.9135446097 learn_rate 0.00193846597404207 
loss 44415.4281474305 learn_rate 0.00203538927274418 
loss 44415.0280995934 learn_rate 0.00213715873638138 
loss 44414.6913315397 learn_rate 0.00224401667320045 
loss 44414.4054314848 learn_rate 0.00235621750686048 
loss 44414.1636312254 learn_rate 0.0024740283822035 
loss 44413.9627329795 learn_rate 0.00259772980131368 
loss 44413.801978487 learn_rate 0.00272761629137936 
loss 44413.6824093515 learn_rate 0.00286399710594833 
loss 44413.6065003035 learn_rate 0.00300719696124574 
loss 44413.5779528661 learn_rate 0.00315755680930803 
loss 44413.6015882885 learn_rate 0.00157877840465402 
loss 44411.7172157975 learn_rate 0.00165771732488672 
loss 44410.5778570987 learn_rate 0.00174060319113105 
loss 44409.7933922653 learn_rate 0.0018276333506876 
loss 44409.205840638 learn_rate 0.00191901501822199 
loss 44408.7404914732 learn_rate 0.00201496576913308 
loss 44408.3584318341 learn_rate 0.00211571405758974 
loss 44408.0382009622 learn_rate 0.00222149976046923 
loss 44407.7677210172 learn_rate 0.00233257474849269 
loss 44407.5404071848 learn_rate 0.00244920348591732 
loss 44407.353154993 learn_rate 0.00257166366021319 
loss 44407.2052409278 learn_rate 0.00270024684322385 
loss 44407.0976999207 learn_rate 0.00283525918538504 
loss 44407.0329680912 learn_rate 0.00297702214465429 
loss 44407.0146815907 learn_rate 0.00312587325188701 
loss 44407.0475721869 learn_rate 0.0015629366259435 
loss 44405.2234521578 learn_rate 0.00164108345724068 
loss 44404.1180637655 learn_rate 0.00172313763010271 
loss 44403.3570911507 learn_rate 0.00180929451160785 
loss 44402.7879986591 learn_rate 0.00189975923718824 
loss 44402.338353618 learn_rate 0.00199474719904765 
loss 44401.9703192078 learn_rate 0.00209448455900004 
loss 44401.6629830912 learn_rate 0.00219920878695004 
loss 44401.4045560032 learn_rate 0.00230916922629754 
loss 44401.1886005336 learn_rate 0.00242462768761242 
loss 44401.0120759677 learn_rate 0.00254585907199304 
loss 44400.8742687282 learn_rate 0.00267315202559269 
loss 44400.7761859812 learn_rate 0.00280680962687232 
loss 44400.7202072229 learn_rate 0.00294715010821594 
loss 44400.7098878782 learn_rate 0.00309450761362674 
loss 44400.7498572434 learn_rate 0.00154725380681337 
loss 44398.9826924429 learn_rate 0.00162461649715404 
loss 44397.9090450727 learn_rate 0.00170584732201174 
loss 44397.1697054143 learn_rate 0.00179113968811233 
loss 44396.6173871979 learn_rate 0.00188069667251794 
loss 44396.1818652377 learn_rate 0.00197473150614384 
loss 44395.8263483227 learn_rate 0.00207346808145103 
loss 44395.5304572047 learn_rate 0.00217714148552358 
loss 44395.2826824164 learn_rate 0.00228599855979976 
loss 44395.0767283373 learn_rate 0.00240029848778975 
loss 44394.9096140632 learn_rate 0.00252031341217924 
loss 44394.780632983 learn_rate 0.0026463290827882 
loss 44394.6907621861 learn_rate 0.00277864553692761 
loss 44394.6423227519 learn_rate 0.00291757781377399 
loss 44394.6387880598 learn_rate 0.00306345670446269 
loss 44394.6846840838 learn_rate 0.00153172835223135 
loss 44392.9720824533 learn_rate 0.00160831476984291 
loss 44391.9287310123 learn_rate 0.00168873050833506 
loss 44391.2098842117 learn_rate 0.00177316703375181 
loss 44390.6733238194 learn_rate 0.0018618253854394 
loss 44390.2509674204 learn_rate 0.00195491665471137 
loss 44389.9070426031 learn_rate 0.00205266248744694 
loss 44389.6216909112 learn_rate 0.00215529561181929 
loss 44389.3836767174 learn_rate 0.00226306039241025 
loss 44389.1868432932 learn_rate 0.00237621341203077 
loss 44389.0282682437 learn_rate 0.0024950240826323 
loss 44388.9072515235 learn_rate 0.00261977528676392 
loss 44388.8247403108 learn_rate 0.00275076405110212 
loss 44388.7829978399 learn_rate 0.00288830225365722 
loss 44388.7854163556 learn_rate 0.00144415112682861 
loss 44387.3089777132 learn_rate 0.00151635868317004 
loss 44386.3805636953 learn_rate 0.00159217661732854 
loss 44385.7259100652 learn_rate 0.00167178544819497 
loss 44385.2280695139 learn_rate 0.00175537472060472 
loss 44384.8297667037 learn_rate 0.00184314345663496 
loss 44384.5003770241 learn_rate 0.0019353006294667 
loss 44384.2226490294 learn_rate 0.00203206566094004 
loss 44383.9866813607 learn_rate 0.00213366894398704 
loss 44383.7869496673 learn_rate 0.00224035239118639 
loss 44383.6207417703 learn_rate 0.00235237001074571 
 training_time 00:08:20.4219120 
memory 3
Save model to model.txt
=== END program1: ./run learn ../dataset2/train --- OK [503s]
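
Note on the log above: the learn_rate column follows a bold-driver-style schedule. Every epoch that lowers the loss scales the rate up by about 5%, and an epoch that raises it halves the rate (e.g. the drop from 0.00593296 to 0.00296648 immediately after the loss ticks up). A minimal sketch of that update rule, in Python for illustration only (MyMediaLite itself is C#; the 1.05 and 0.5 factors are read off the logged values, not taken from its source):

def bold_driver_step(loss, prev_loss, learn_rate, up=1.05, down=0.5):
    # Adapt the step size from the change in training loss.
    # Factors inferred from the log: consecutive rates grow by ~5%
    # while the loss falls, and the rate halves when it rises.
    if loss <= prev_loss:
        return learn_rate * up    # made progress: accelerate gently
    return learn_rate * down      # overshot: back off hard

Applied once per epoch, this reproduces the sawtooth in the log: long runs of gently rising rates punctuated by a halving whenever the loss increases.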

===== MAIN: predict/evaluate on train data =====
=== START program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in
=== END program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in --- OK [1s]
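
The stripLabels step presumably copies the rating file minus the label column, so the predictor only ever sees (user, item) pairs. A hypothetical reimplementation, assuming a whitespace-separated "user item rating" layout for the dataset files:

import sys

def strip_labels(src_path, dst_path):
    # Keep only the (user, item) pair of each record; the rating
    # (the label being predicted) is dropped.
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            user, item, *_rating = line.split()
            dst.write(f"{user}\t{item}\n")

if __name__ == "__main__":
    strip_labels(sys.argv[1], sys.argv[2])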
=== START program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out
loading_time 0.77
ratings range: [0, 5]
training data: 943 users, 1680 items, 90570 ratings, sparsity 94.28306
test data:     943 users, 1680 items, 90570 ratings, sparsity 94.28306
Load model from model.txt
Set num_factors to 943
BiasedMatrixFactorization num_factors=60 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 3.56743 MAE 3.46544 NMAE 0.69309 testing_time 00:00:00.1455630
predicting_time 00:00:00.5447650
memory 4
=== END program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out --- OK [3s]
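
For reference, the BiasedMatrixFactorization model named in the log scores a (user, item) pair as a global average plus user and item biases plus a dot product of the num_factors=60 latent vectors. A schematic of that prediction rule (variable names are hypothetical; clipping to the logged ratings range [0, 5] is an assumption):

import numpy as np

def predict_rating(mu, b_user, b_item, P, Q, u, i, r_min=0.0, r_max=5.0):
    # mu: global rating average; b_user/b_item: bias vectors;
    # P, Q: user and item factor matrices with num_factors columns.
    score = mu + b_user[u] + b_item[i] + P[u] @ Q[i]
    return float(np.clip(score, r_min, r_max))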
=== START program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out
=== END program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out --- OK [2s]
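
The metric triple printed by the predictor hangs together: with the logged ratings range [0, 5], NMAE is simply MAE divided by the width of that range (3.46544 / 5 = 0.69309 above). A minimal computation:

import numpy as np

def rating_errors(y_true, y_pred, r_min=0.0, r_max=5.0):
    # RMSE, MAE and range-normalized MAE, as printed in the log.
    err = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mae = float(np.mean(np.abs(err)))
    nmae = mae / (r_max - r_min)   # 3.46544 / 5 = 0.69309 above
    return rmse, mae, nmae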

===== MAIN: predict/evaluate on test data =====
=== START program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in
=== END program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in --- OK [1s]
=== START program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out
loading_time 0.48
ratings range: [0, 5]
training data: 943 users, 1680 items, 90570 ratings, sparsity 94.28306
test data:     943 users, 1129 items, 9430 ratings, sparsity 99.11426
Load model from model.txt
Set num_factors to 943
BiasedMatrixFactorization num_factors=60 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 3.62608 MAE 3.55539 NMAE 0.71108 testing_time 00:00:00.0272780
predicting_time 00:00:00.0324240
memory 3
=== END program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out --- OK [2s]
=== START program4: ./run evaluate ../dataset2/test ../program0/evalTest.out
=== END program4: ./run evaluate ../dataset2/test ../program0/evalTest.out --- OK [1s]
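
The dataset figures printed by the two predict phases are mutually consistent: 90,570 training ratings plus 9,430 test ratings reassemble the 100,000 ratings of MovieLens 100k, and the reported sparsity is the percentage of the user-item matrix left empty:

users, items = 943, 1680
train_ratings, test_ratings = 90_570, 9_430

assert train_ratings + test_ratings == 100_000   # MovieLens 100k

# Sparsity as printed in the log: share of empty (user, item) cells.
sparsity = 100 * (1 - train_ratings / (users * items))
print(f"{sparsity:.5f}")   # -> 94.28306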


real	8m35.253s
user	5m36.993s
sys	0m1.628s
