Run 14872
Creator: internal
Program: MyMediaLite-biased-matrix-factorization-k-40
Dataset: movielens100k
Task type: CollaborativeFiltering
Created: 6y 40d ago
Status: Done
Time: 6m29s
414M
Results: 0.488, 0.383, 0.973, 0.763

Log file

... (lines omitted) ...
loss 46091.5787367819 learn_rate 0.00364232117678914 
loss 46090.0595659891 learn_rate 0.0038244372356286 
loss 46088.797800186 learn_rate 0.00401565909741003 
loss 46087.7445476227 learn_rate 0.00421644205228053 
loss 46086.8778787055 learn_rate 0.00442726415489456 
loss 46086.1913906646 learn_rate 0.00464862736263929 
loss 46085.6885556127 learn_rate 0.00488105873077125 
loss 46085.3797809368 learn_rate 0.00512511166730982 
loss 46085.2808486415 learn_rate 0.00538136725067531 
loss 46085.4121044365 learn_rate 0.00269068362533765 
loss 46073.2346744095 learn_rate 0.00282521780660454 
loss 46067.9226853323 learn_rate 0.00296647869693476 
loss 46064.7670180361 learn_rate 0.0031148026317815 
loss 46062.5643247135 learn_rate 0.00327054276337058 
loss 46060.8748569007 learn_rate 0.00343406990153911 
loss 46059.5074298539 learn_rate 0.00360577339661606 
loss 46058.3706736762 learn_rate 0.00378606206644686 
loss 46057.4196364332 learn_rate 0.00397536516976921 
loss 46056.633653232 learn_rate 0.00417413342825767 
loss 46056.0061459364 learn_rate 0.00438284009967055 
loss 46055.5395379294 learn_rate 0.00460198210465408 
loss 46055.2425840816 learn_rate 0.00483208120988679 
loss 46055.1289368771 learn_rate 0.00507368527038113 
loss 46055.2163896459 learn_rate 0.00253684263519056 
loss 46044.8121955441 learn_rate 0.00266368476695009 
loss 46040.1090199947 learn_rate 0.0027968690052976 
loss 46037.2747433546 learn_rate 0.00293671245556248 
loss 46035.2840751851 learn_rate 0.0030835480783406 
loss 46033.7526157938 learn_rate 0.00323772548225763 
loss 46032.5102277715 learn_rate 0.00339961175637051 
loss 46031.4744146102 learn_rate 0.00356959234418904 
loss 46030.6039649734 learn_rate 0.00374807196139849 
loss 46029.8794724575 learn_rate 0.00393547555946841 
loss 46029.2942698069 learn_rate 0.00413224933744183 
loss 46028.8498727686 learn_rate 0.00433886180431393 
loss 46028.5535672962 learn_rate 0.00455580489452962 
loss 46028.4171001124 learn_rate 0.0047835951392561 
loss 46028.4559775545 learn_rate 0.00239179756962805 
loss 46019.5623152223 learn_rate 0.00251138744810946 
loss 46015.3962088837 learn_rate 0.00263695682051493 
loss 46012.8456407191 learn_rate 0.00276880466154067 
loss 46011.0398610641 learn_rate 0.00290724489461771 
loss 46009.6440185602 learn_rate 0.00305260713934859 
loss 46008.5072110717 learn_rate 0.00320523749631602 
loss 46007.5551905769 learn_rate 0.00336549937113183 
loss 46006.7502983624 learn_rate 0.00353377433968842 
loss 46006.0743848967 learn_rate 0.00371046305667284 
loss 46005.5207816435 learn_rate 0.00389598620950648 
loss 46005.0902354971 learn_rate 0.0040907855199818 
loss 46004.78873932 learn_rate 0.00429532479598089 
loss 46004.6263435214 learn_rate 0.00451009103577994 
loss 46004.6165113749 learn_rate 0.00473559558756894 
loss 46004.7757948377 learn_rate 0.00236779779378447 
loss 45996.237251159 learn_rate 0.00248618768347369 
loss 45992.2468345612 learn_rate 0.00261049706764738 
loss 45989.8239202879 learn_rate 0.00274102192102975 
loss 45988.1271521966 learn_rate 0.00287807301708123 
loss 45986.8314895738 learn_rate 0.00302197666793529 
loss 45985.790045577 learn_rate 0.00317307550133206 
loss 45984.9304460759 learn_rate 0.00333172927639866 
loss 45984.215976731 learn_rate 0.0034983157402186 
loss 45983.6289708452 learn_rate 0.00367323152722953 
loss 45983.1629852788 learn_rate 0.003856893103591 
loss 45982.8188326773 learn_rate 0.00404973775877055 
loss 45982.60246178 learn_rate 0.00425222464670908 
loss 45982.5237949795 learn_rate 0.00446483587904454 
loss 45982.5960969708 learn_rate 0.00223241793952227 
loss 45975.2744969637 learn_rate 0.00234403883649838 
loss 45971.7225282774 learn_rate 0.0024612407783233 
loss 45969.5234830894 learn_rate 0.00258430281723947 
loss 45967.9649810817 learn_rate 0.00271351795810144 
loss 45966.7647757885 learn_rate 0.00284919385600651 
loss 45965.7929312489 learn_rate 0.00299165354880684 
loss 45964.9844203469 learn_rate 0.00314123622624718 
loss 45964.3057089683 learn_rate 0.00329829803755954 
loss 45963.7402539092 learn_rate 0.00346321293943751 
loss 45963.2816002855 learn_rate 0.00363637358640939 
loss 45962.9298544165 learn_rate 0.00381819226572986 
loss 45962.6897865555 learn_rate 0.00400910187901635 
loss 45962.5697818184 learn_rate 0.00420955697296717 
loss 45962.5812637042 learn_rate 0.00210477848648359 
loss 45956.3134009114 learn_rate 0.00221001741080776 
loss 45953.1623379197 learn_rate 0.00232051828134815 
loss 45951.1743162321 learn_rate 0.00243654419541556 
loss 45949.7485281017 learn_rate 0.00255837140518634 
loss 45948.6409785302 learn_rate 0.00268628997544566 
loss 45947.7372902462 learn_rate 0.00282060447421794 
loss 45946.9793644995 learn_rate 0.00296163469792884 
loss 45946.3367883259 learn_rate 0.00310971643282528 
loss 45945.7942350395 learn_rate 0.00326520225446654 
loss 45945.3454008416 learn_rate 0.00342846236718987 
loss 45944.9898830652 learn_rate 0.00359988548554936 
loss 45944.7314909436 learn_rate 0.00377987975982683 
loss 45944.5773055283 learn_rate 0.00396887374781817 
loss 45944.5371584282 learn_rate 0.00416731743520908 
loss 45944.6233598729 learn_rate 0.00208365871760454 
loss 45938.5921688007 learn_rate 0.00218784165348477 
loss 45935.5633461771 learn_rate 0.00229723373615901 
loss 45933.6630442185 learn_rate 0.00241209542296696 
loss 45932.3105993636 learn_rate 0.0025327001941153 
loss 45931.2692103723 learn_rate 0.00265933520382107 
loss 45930.4275807043 learn_rate 0.00279230196401212 
loss 45929.7290938919 learn_rate 0.00293191706221273 
loss 45929.1441016151 learn_rate 0.00307851291532337 
loss 45928.6576772988 learn_rate 0.00323243856108953 
loss 45928.2637101522 learn_rate 0.00339406048914401 
loss 45927.9618602174 learn_rate 0.00356376351360121 
loss 45927.7559091786 learn_rate 0.00374195168928127 
loss 45927.6528419098 learn_rate 0.00392904927374534 
loss 45927.6623368396 learn_rate 0.00196452463687267 
loss 45922.4874018618 learn_rate 0.0020627508687163 
loss 45919.7916382792 learn_rate 0.00216588841215212 
loss 45918.0640074022 learn_rate 0.00227418283275972 
loss 45916.8165113826 learn_rate 0.00238789197439771 
loss 45915.8451800164 learn_rate 0.00250728657311759 
loss 45915.0523596001 learn_rate 0.00263265090177347 
loss 45914.3875923327 learn_rate 0.00276428344686215 
loss 45913.8240327636 learn_rate 0.00290249761920526 
loss 45913.3478623904 learn_rate 0.00304762250016552 
loss 45912.9531241038 learn_rate 0.0032000036251738 
loss 45912.6390371721 learn_rate 0.00336000380643248 
loss 45912.408534562 learn_rate 0.00352800399675411 
loss 45912.2674439271 learn_rate 0.00370440419659182 
loss 45912.2240298386 learn_rate 0.00388962440642141 
loss 45912.2887517761 learn_rate 0.0019448122032107 
loss 45907.3025445662 learn_rate 0.00204205281337124 
loss 45904.7057509308 learn_rate 0.0021441554540398 
loss 45903.0483995145 learn_rate 0.00225136322674179 
loss 45901.8588413526 learn_rate 0.00236393138807888 
loss 45900.9391501432 learn_rate 0.00248212795748282 
loss 45900.194321319 learn_rate 0.00260623435535697 
loss 45899.5751902847 learn_rate 0.00273654607312481 
loss 45899.0555841606 learn_rate 0.00287337337678106 
loss 45898.6220378435 learn_rate 0.00301704204562011 
loss 45898.268764761 learn_rate 0.00316789414790111 
loss 45897.9950388842 learn_rate 0.00332628885529617 
loss 45897.8037673294 learn_rate 0.00349260329806098 
loss 45897.700690663 learn_rate 0.00366723346296403 
loss 45897.6939357474 learn_rate 0.00385059513611223 
loss 45897.7937792467 learn_rate 0.00192529756805611 
loss 45892.976324108 learn_rate 0.00202156244645892 
loss 45890.4639557737 learn_rate 0.00212264056878187 
loss 45888.8644429783 learn_rate 0.00222877259722096 
loss 45887.7216401725 learn_rate 0.00234021122708201 
loss 45886.8433159672 learn_rate 0.00245722178843611 
loss 45886.1369224008 learn_rate 0.00258008287785791 
loss 45885.5544600541 learn_rate 0.00270908702175081 
loss 45885.0703333802 learn_rate 0.00284454137283835 
loss 45884.6713552588 learn_rate 0.00298676844148027 
loss 45884.3518478378 learn_rate 0.00313610686355428 
loss 45884.1110881832 learn_rate 0.003292912206732 
loss 45883.9519140929 learn_rate 0.0034575578170686 
loss 45883.8799423724 learn_rate 0.00363043570792202 
loss 45883.903131353 learn_rate 0.00181521785396101 
loss 45879.7659284379 learn_rate 0.00190597874665906 
loss 45877.5282146721 learn_rate 0.00200127768399202 
loss 45876.0709330957 learn_rate 0.00210134156819162 
loss 45875.0124345787 learn_rate 0.0022064086466012 
loss 45874.1879474429 learn_rate 0.00231672907893126 
loss 45873.516677913 learn_rate 0.00243256553287782 
loss 45872.9560651801 learn_rate 0.00255419380952171 
loss 45872.4830569771 learn_rate 0.0026819034999978 
loss 45872.0855253005 learn_rate 0.00281599867499769 
loss 45871.7580062009 learn_rate 0.00295679860874757 
loss 45871.4994586725 learn_rate 0.00310463853918495 
loss 45871.3120342633 learn_rate 0.0032598704661442 
loss 45871.2003842702 learn_rate 0.00342286398945141 
loss 45871.1712702017 learn_rate 0.00359400718892398 
loss 45871.2333558777 learn_rate 0.00179700359446199 
loss 45867.242214945 learn_rate 0.00188685377418509 
loss 45865.0825371978 learn_rate 0.00198119646289434 
loss 45863.6800557353 learn_rate 0.00208025628603906 
loss 45862.6659327981 learn_rate 0.00218426910034101 
loss 45861.8803228429 learn_rate 0.00229348255535807 
loss 45861.2446530034 learn_rate 0.00240815668312597 
loss 45860.7174686713 learn_rate 0.00252856451728227 
loss 45860.2763014558 learn_rate 0.00265499274314638 
loss 45859.9093343604 learn_rate 0.0027877423803037 
loss 45859.6112564226 learn_rate 0.00292712949931889 
loss 45859.3810786366 learn_rate 0.00307348597428483 
loss 45859.2209337884 learn_rate 0.00322716027299907 
loss 45859.135400388 learn_rate 0.00338851828664902 
loss 45859.1311225107 learn_rate 0.00355794420098148 
loss 45859.2166069415 learn_rate 0.00177897210049074 
loss 45855.3565031011 learn_rate 0.00186792070551528 
loss 45853.2637493063 learn_rate 0.00196131674079104 
loss 45851.9066199651 learn_rate 0.00205938257783059 
loss 45850.9284579374 learn_rate 0.00216235170672212 
loss 45850.174067174 learn_rate 0.00227046929205823 
loss 45849.5669381216 learn_rate 0.00238399275666114 
loss 45849.0666164547 learn_rate 0.0025031923944942 
loss 45848.6511389043 learn_rate 0.00262835201421891 
loss 45848.3089373724 learn_rate 0.00275976961492985 
loss 45848.0348032398 learn_rate 0.00289775809567634 
loss 45847.8277574369 learn_rate 0.00304264600046016 
loss 45847.6898787259 learn_rate 0.00319477830048317 
loss 45847.6256432119 learn_rate 0.00335451721550733 
loss 45847.6415527509 learn_rate 0.00167725860775366 
loss 45844.3256322345 learn_rate 0.00176112153814135 
loss 45842.4624768527 learn_rate 0.00184917761504841 
loss 45841.2258512714 learn_rate 0.00194163649580083 
loss 45840.3187536016 learn_rate 0.00203871832059088 
loss 45839.6088352596 learn_rate 0.00214065423662042 
loss 45839.0296588081 learn_rate 0.00224768694845144 
loss 45838.5455495598 learn_rate 0.00236007129587401 
loss 45838.1368420099 learn_rate 0.00247807486066771 
loss 45837.7929737503 learn_rate 0.0026019786037011 
loss 45837.5089982149 learn_rate 0.00273207753388616 
loss 45837.2837260579 learn_rate 0.00286868141058046 
loss 45837.1186960785 learn_rate 0.00301211548110949 
loss 45837.0175929322 learn_rate 0.00316272125516496 
loss 45836.9859186791 learn_rate 0.00332085731792321 
loss 45837.0308168176 learn_rate 0.0016604286589616 
loss 45833.8299753481 learn_rate 0.00174345009190969 
loss 45832.0299191666 learn_rate 0.00183062259650517 
loss 45830.8374971244 learn_rate 0.00192215372633043 
loss 45829.9657805975 learn_rate 0.00201826141264695 
loss 45829.2864431699 learn_rate 0.0021191744832793 
loss 45828.7349116361 learn_rate 0.00222513320744326 
loss 45828.2764592964 learn_rate 0.00233638986781543 
loss 45827.8919291427 learn_rate 0.0024532093612062 
loss 45827.5710339366 learn_rate 0.00257586982926651 
loss 45827.3089658039 learn_rate 0.00270466332072983 
loss 45827.1045862586 learn_rate 0.00283989648676632 
loss 45826.9594230294 learn_rate 0.00298189131110464 
loss 45826.8771022014 learn_rate 0.00313098587665987 
loss 45826.8630279492 learn_rate 0.00328753517049287 
loss 45826.9242110745 learn_rate 0.00164376758524643 
loss 45823.8264394341 learn_rate 0.00172595596450875 
loss 45822.0804514369 learn_rate 0.00181225376273419 
loss 45820.924574713 learn_rate 0.0019028664508709 
loss 45820.0814606958 learn_rate 0.00199800977341445 
loss 45819.4265771464 learn_rate 0.00209791026208517 
loss 45818.8970818295 learn_rate 0.00220280577518943 
loss 45818.4591058696 learn_rate 0.0023129460639489 
loss 45818.0939338864 learn_rate 0.00242859336714634 
loss 45817.7915013402 learn_rate 0.00255002303550366 
loss 45817.5470964674 learn_rate 0.00267752418727884 
loss 45817.3595963928 learn_rate 0.00281140039664279 
loss 45817.2304882932 learn_rate 0.00295197041647493 
loss 45817.1633149352 learn_rate 0.00309956893729867 
loss 45817.1633618852 learn_rate 0.00154978446864934 
loss 45814.5016187512 learn_rate 0.0016272736920818 
loss 45812.9481234025 learn_rate 0.00170863737668589 
loss 45811.8952714896 learn_rate 0.00179406924552019 
loss 45811.1132190425 learn_rate 0.0018837727077962 
loss 45810.4963153268 learn_rate 0.00197796134318601 
loss 45809.9902852727 learn_rate 0.00207685941034531 
loss 45809.5654283257 learn_rate 0.00218070238086257 
loss 45809.2050867948 learn_rate 0.0022897374999057 
loss 45808.9001368437 learn_rate 0.00240422437490099 
loss 45808.6461576986 learn_rate 0.00252443559364604 
loss 45808.4419015609 learn_rate 0.00265065737332834 
loss 45808.2884382198 learn_rate 0.00278319024199476 
loss 45808.1886685401 learn_rate 0.00292234975409449 
loss 45808.147049617 learn_rate 0.00306846724179922 
loss 45808.1694476664 learn_rate 0.00153423362089961 
loss 45805.5991518746 learn_rate 0.00161094530194459 
loss 45804.0972727151 learn_rate 0.00169149256704182 
loss 45803.0806787649 learn_rate 0.00177606719539391 
loss 45802.3274390244 learn_rate 0.00186487055516361 
loss 45801.7351806796 learn_rate 0.00195811408292179 
loss 45801.2511901407 learn_rate 0.00205601978706788 
loss 45800.8465779049 learn_rate 0.00215882077642127 
loss 45800.5051287888 learn_rate 0.00226676181524233 
loss 45800.2179624055 learn_rate 0.00238009990600445 
loss 45799.9807837543 learn_rate 0.00249910490130467 
loss 45799.7923945993 learn_rate 0.00262406014636991 
loss 45799.6538602866 learn_rate 0.0027552631536884 
loss 45799.5680356473 learn_rate 0.00289302631137282 
loss 45799.5392972125 learn_rate 0.00303767762694146 
loss 45799.5734000648 learn_rate 0.00151883881347073 
loss 45797.0847318541 learn_rate 0.00159478075414427 
loss 45795.6269797881 learn_rate 0.00167451979185148 
loss 45794.640306872 learn_rate 0.00175824578144406 
loss 45793.9102999379 learn_rate 0.00184615807051626 
loss 45793.3376721926 learn_rate 0.00193846597404207 
loss 45792.8711566884 learn_rate 0.00203538927274418 
loss 45792.4825968532 learn_rate 0.00213715873638138 
loss 45792.1561623188 learn_rate 0.00224401667320045 
loss 45791.8831712105 learn_rate 0.00235621750686048 
loss 45791.6594180936 learn_rate 0.0024740283822035 
loss 45791.4837244906 learn_rate 0.00259772980131368 
loss 45791.3571264899 learn_rate 0.00272761629137936 
loss 45791.2824120993 learn_rate 0.00286399710594833 
loss 45791.2638598995 learn_rate 0.00300719696124574 
loss 45791.307099402 learn_rate 0.00150359848062287 
loss 45788.8959012182 learn_rate 0.00157877840465402 
loss 45787.4795742427 learn_rate 0.00165771732488672 
loss 45786.5206044153 learn_rate 0.00174060319113105 
loss 45785.8118341188 learn_rate 0.0018276333506876 
loss 45785.2569668234 learn_rate 0.00191901501822199 
loss 45784.8061403718 learn_rate 0.00201496576913308 
loss 45784.4319101197 learn_rate 0.00211571405758974 
loss 45784.1188186052 learn_rate 0.00222149976046923 
loss 45783.8583749672 learn_rate 0.00233257474849269 
loss 45783.6464582432 learn_rate 0.00244920348591732 
loss 45783.4819061139 learn_rate 0.00257166366021319 
loss 45783.3657228345 learn_rate 0.00270024684322385 
loss 45783.3006278121 learn_rate 0.00283525918538504 
loss 45783.2908005714 learn_rate 0.00297702214465429 
loss 45783.341744561 learn_rate 0.00148851107232715 
loss 45781.0048627276 learn_rate 0.0015629366259435 
loss 45779.6281294003 learn_rate 0.00164108345724068 
loss 45778.695440712 learn_rate 0.00172313763010271 
loss 45778.0066487539 learn_rate 0.00180929451160785 
loss 45777.4683590164 learn_rate 0.00189975923718824 
loss 45777.0320771547 learn_rate 0.00199474719904765 
loss 45776.6710533584 learn_rate 0.00209448455900004 
loss 45776.3701946508 learn_rate 0.00219920878695004 
loss 45776.1211969382 learn_rate 0.00230916922629754 
loss 45775.9200217194 learn_rate 0.00242462768761242 
loss 45775.7655221489 learn_rate 0.00254585907199304 
loss 45775.6586708411 learn_rate 0.00267315202559269 
loss 45775.6021194257 learn_rate 0.00280680962687232 
loss 45775.5999496911 learn_rate 0.00294715010821594 
loss 45775.6575407796 learn_rate 0.00147357505410797 
loss 45773.3922093171 learn_rate 0.00154725380681337 
loss 45772.0535500819 learn_rate 0.00162461649715404 
loss 45771.1460051623 learn_rate 0.00170584732201174 
loss 45770.4762020063 learn_rate 0.00179113968811233 
loss 45769.953563741 learn_rate 0.00188069667251794 
loss 45769.5309285296 learn_rate 0.00197473150614384 
loss 45769.1822252398 learn_rate 0.00207346808145103 
loss 45768.8927176348 learn_rate 0.00217714148552358 
loss 45768.6542847279 learn_rate 0.00228599855979976 
loss 45768.4629690146 learn_rate 0.00240029848778975 
loss 45768.3176389855 learn_rate 0.00252031341217924 
loss 45768.2192363479 learn_rate 0.0026463290827882 
loss 45768.1703463196 learn_rate 0.00277864553692761 
loss 45768.1749548121 learn_rate 0.00138932276846381 
loss 45766.2274587396 learn_rate 0.001458788906887 
loss 45765.0370741773 learn_rate 0.00153172835223135 
loss 45764.2105738703 learn_rate 0.00160831476984291 
loss 45763.5887640992 learn_rate 0.00168873050833506 
loss 45763.0953430552 learn_rate 0.00177316703375181 
loss 45762.6898806947 learn_rate 0.0018618253854394 
loss 45762.3496925527 learn_rate 0.00195491665471137 
loss 45762.0617721664 learn_rate 0.00205266248744694 
loss 45761.8188297713 learn_rate 0.00215529561181929 
loss 45761.6172078386 learn_rate 0.00226306039241025 
loss 45761.4557312118 learn_rate 0.00237621341203077 
loss 45761.3350542704 learn_rate 0.0024950240826323 
loss 45761.2572862903 learn_rate 0.00261977528676392 
loss 45761.2257794492 learn_rate 0.00275076405110212 
loss 45761.2450161631 learn_rate 0.00137538202555106 
loss 45759.3632935686 learn_rate 0.00144415112682861 
loss 45758.2112570559 learn_rate 0.00151635868317004 
loss 45757.411665521 learn_rate 0.00159217661732854 
loss 45756.8109237191 learn_rate 0.00167178544819497 
loss 45756.3351674173 learn_rate 0.00175537472060472 
loss 45755.9451720313 learn_rate 0.00184314345663496 
loss 45755.6188979962 learn_rate 0.0019353006294667 
loss 45755.3436992307 learn_rate 0.00203206566094004 
loss 45755.1124886187 learn_rate 0.00213366894398704 
loss 45754.921716374 learn_rate 0.00224035239118639 
loss 45754.7702529936 learn_rate 0.00235237001074571 
 training_time 00:06:17.5435910 
memory 3
Save model to model.txt
=== END program1: ./run learn ../dataset2/train --- OK [379s]
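Note on the learn_rate column above: it grows by a factor of 1.05 after every epoch in which the loss decreased and is halved whenever the loss ticks back up (e.g. 0.00538 → 0.00269), i.e. a "bold driver" style step-size schedule. Below is a minimal Python sketch, under assumptions and not MyMediaLite's actual implementation, of biased matrix factorization trained by SGD with such a schedule; the hyperparameter names mirror the config line reported later in this log, and `ratings` is assumed to be a list of (user_index, item_index, rating) triples.

```python
# Sketch only: biased matrix factorization via SGD with a bold-driver-style
# learning rate (grow 5% after an improving epoch, halve after a worse one),
# mimicking the loss/learn_rate pattern printed in the log above.
import numpy as np

def train_biased_mf(ratings, n_users, n_items, k=40, reg=0.015, bias_reg=0.0001,
                    learn_rate=0.01, num_iter=30, init_stdev=0.1, seed=0):
    rng = np.random.default_rng(seed)
    P = rng.normal(0.0, init_stdev, (n_users, k))    # user factor matrix
    Q = rng.normal(0.0, init_stdev, (n_items, k))    # item factor matrix
    bu = np.zeros(n_users)                           # user biases
    bi = np.zeros(n_items)                           # item biases
    mu = float(np.mean([r for _, _, r in ratings]))  # global mean rating

    def objective():
        # regularized squared error over all training ratings
        sq = sum((r - (mu + bu[u] + bi[i] + P[u] @ Q[i])) ** 2
                 for u, i, r in ratings)
        return (sq + reg * (np.sum(P ** 2) + np.sum(Q ** 2))
                   + bias_reg * (np.sum(bu ** 2) + np.sum(bi ** 2)))

    prev = objective()
    for _ in range(num_iter):
        for u, i, r in ratings:
            err = r - (mu + bu[u] + bi[i] + P[u] @ Q[i])
            bu[u] += learn_rate * (err - bias_reg * bu[u])
            bi[i] += learn_rate * (err - bias_reg * bi[i])
            pu_old = P[u].copy()
            P[u] += learn_rate * (err * Q[i] - reg * P[u])
            Q[i] += learn_rate * (err * pu_old - reg * Q[i])
        cur = objective()
        # bold-driver adaptation of the step size
        learn_rate = learn_rate * 1.05 if cur < prev else learn_rate * 0.5
        print(f"loss {cur} learn_rate {learn_rate}")
        prev = cur
    return mu, bu, bi, P, Q
```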

===== MAIN: predict/evaluate on train data =====
=== START program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in
=== END program3: ./run stripLabels ../dataset2/train ../program0/evalTrain.in --- OK [2s]
=== START program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out
loading_time 0.79
ratings range: [0, 5]
training data: 943 users, 1680 items, 90570 ratings, sparsity 94.28306
test data:     943 users, 1680 items, 90570 ratings, sparsity 94.28306
Load model from model.txt
Set num_factors to 943
BiasedMatrixFactorization num_factors=40 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 3.57186 MAE 3.4713 NMAE 0.69426 testing_time 00:00:00.1747140
predicting_time 00:00:00.4426720
memory 4
=== END program1: ./run predict ../program0/evalTrain.in ../program0/evalTrain.out --- OK [2s]
=== START program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out
=== END program4: ./run evaluate ../dataset2/train ../program0/evalTrain.out --- OK [2s]
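The predict step above reports RMSE, MAE and NMAE against the training ratings; NMAE is simply MAE divided by the width of the rating range, so with the [0, 5] range in this log 3.4713 / 5 = 0.69426. A small sketch (not the evaluator this run used) of the three metrics:

```python
# Sketch: the three rating-prediction metrics reported in the log.
import numpy as np

def rating_metrics(y_true, y_pred, r_min=0.0, r_max=5.0):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    rmse = float(np.sqrt(np.mean(err ** 2)))   # root mean squared error
    mae = float(np.mean(np.abs(err)))          # mean absolute error
    nmae = mae / (r_max - r_min)               # MAE normalized by the rating range
    return rmse, mae, nmae
```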

===== MAIN: predict/evaluate on test data =====
=== START program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in
=== END program3: ./run stripLabels ../dataset2/test ../program0/evalTest.in --- OK [2s]
=== START program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out
loading_time 0.5
ratings range: [0, 5]
training data: 943 users, 1680 items, 90570 ratings, sparsity 94.28306
test data:     943 users, 1129 items, 9430 ratings, sparsity 99.11426
Load model from model.txt
Set num_factors to 943
BiasedMatrixFactorization num_factors=40 bias_reg=0.0001 reg_u=0.015 reg_i=0.015 learn_rate=0.01 num_iter=30 bold_driver=False init_mean=0 init_stdev=0.1 optimize_mae=False RMSE 3.62542 MAE 3.54868 NMAE 0.70974 testing_time 00:00:00.0083620
predicting_time 00:00:00.0250820
memory 2
=== END program1: ./run predict ../program0/evalTest.in ../program0/evalTest.out --- OK [1s]
=== START program4: ./run evaluate ../dataset2/test ../program0/evalTest.out
=== END program4: ./run evaluate ../dataset2/test ../program0/evalTest.out --- OK [2s]
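The BiasedMatrixFactorization config line documents the model's hyperparameters; the prediction it makes for a (user, item) pair is the global mean plus the user and item biases plus the dot product of the two latent factor vectors. A hedged sketch of that scoring rule, reusing the hypothetical factors and biases returned by the training sketch earlier and clipping to the [0, 5] range reported in the log:

```python
# Sketch: score one (user, item) pair with a trained biased MF model.
import numpy as np

def predict_rating(mu, bu, bi, P, Q, user, item, r_min=0.0, r_max=5.0):
    score = mu + bu[user] + bi[item] + float(P[user] @ Q[item])
    return float(np.clip(score, r_min, r_max))  # keep inside the rating range
```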


real	6m32.990s
user	6m24.020s
sys	0m2.772s


