My DICT is too big.
Hello Fellow Patchers,
WHAT I’M DOING:
I am trying to create an automated interactive karaoke system where users will select songs from a HUGE karaoke database and the songs will automatically be cued by the system. What I would like to have happen is to have two separate systems where users could search by either Artist or Song Title. I thought I could use the dict object to create a hierarchical database that I could use to populate umenus. For example, if "By Artist" is selected, the dict would populate one umenu with all Artists, and selections from this menu could call up from the dict the songs available from that artist to populate a second umenu.
WHAT MY PROBLEM IS:
I built an iterative system to build the dictionaries from text files, using zl.iter and a sprintf object to format the entries into messages for the dict object. The process takes REALLY LONG! Like 15+ minutes for a single volume. There are so many entries that the zl object stops reporting them even though I set @zlmaxsize really high, so I have to feed the text in in chunks. Apart from the time involved in building the dictionary, I can't seem to call up the data correctly. I think that for some reason the highest level of my data structure is stored as a key: the usual messages like "getnames" or "getkeys" do not return the correct data. I can see the data, which appears to be hierarchically correct, with a dict.view object, but I can't dig down past the highest level with a "get" message.
Am I missing something? Can I reformat the dict somehow so it returns the information correctly (i.e., move the highest level from a key to a subdictionary)? Is there a faster way to build the dictionary? Or is there a different way to create a menu system like the one I describe at the top of this post directly from the file structure on the hard drive, and forgo a dict altogether?
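One way to sidestep the slow zl.iter/sprintf message loop entirely is to build the JSON offline and load it into dict in one go (dict can read a JSON file with a read message). Here is a minimal Python sketch; the tab-separated artist/title line format is an assumption, so adjust the split to match the real volume files:

```python
import json
from collections import defaultdict

def build_karaoke_json(lines):
    """Group song titles under their artist, producing the
    {artist: {"songs": [...]}} hierarchy a Max dict can hold."""
    by_artist = defaultdict(list)
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # Assumed format: artist<TAB>title -- adjust for the real file.
        artist, title = line.split("\t", 1)
        by_artist[artist].append(title)
    return {artist: {"songs": sorted(titles)}
            for artist, titles in by_artist.items()}

# Tiny illustrative sample standing in for one volume's text file.
sample = ["The Beatles\tHey Jude", "The Beatles\tLet It Be",
          "Queen\tBohemian Rhapsody"]
print(json.dumps(build_karaoke_json(sample), indent=2))
```

Dumping the result to, say, volume1.json and sending [dict karaoke] a read message should be dramatically faster than thousands of individual set/replace messages.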
Here is an example patch that shows what I am trying to do. I am also attaching the text file for the first volume if you are brave enough to feed it into the system to build the entire first volume dictionary (of 23 volumes) to see what I am talking about when it comes to performance.
[Compressed Max patch omitted. In the original post, copy the begin_max5_patcher…end_max5_patcher text, then in Max select New From Clipboard.]
Thank you for your help. Happy Patching!
First thoughts:
- For this kind of very long list processing, I'd recommend the bach library, which is much more at ease with (very) large lists.
- Using filein might help too.
- dict *should* be a good tool for your aim. I haven't used it extensively and couldn't say why it's not working the way you want, though. But again, the bach library might help, since it uses a dict-style help system that works very well.
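On the "highest level stored as a key" symptom: dict only treats a level as navigable if it is a subdictionary, and hierarchical gets use the :: path separator. A structure shaped like this JSON (names purely illustrative) is the kind dict digs into cleanly:

```json
{
  "Queen": {
    "songs": [ "Bohemian Rhapsody" ]
  },
  "The Beatles": {
    "songs": [ "Hey Jude", "Let It Be" ]
  }
}
```

With that shape, getkeys at the top level returns the artists for the first umenu, and a message like "get Queen::songs" pulls the titles for the second. If dict.view shows everything nested under one extra top-level key, the patch is probably one level too deep and the get path needs that key prepended.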
"Is there a faster way to build the dictionary? Is there a different way to create a menu system like I describe at the top of this post directly from the file structure on the hard drive and forgo a dict?"
SQLite could handle something like this, provided you have the time/inclination to get your head around it and the discrete files in your file structure are readable by Max (most likely). From the sound of it, it would probably be way faster too. There is not that much info on SQLite in the Max docs, but a forum search should yield some results (look for Andrew B's MovieBase tutorial as a point of departure).
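To make the comparison concrete, here is a rough sketch of the schema and the two queries the umenu system would need. It uses Python's built-in sqlite3 just to show the SQL; the table and column names are made up, and in Max you would issue the same statements through the js/SQLite route the tutorial covers:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for a persistent DB
conn.execute("CREATE TABLE songs (artist TEXT, title TEXT, volume INTEGER)")

# Bulk-insert parsed rows: far faster than message-by-message building.
rows = [
    ("The Beatles", "Hey Jude", 1),
    ("The Beatles", "Let It Be", 1),
    ("Queen", "Bohemian Rhapsody", 1),
]
conn.executemany("INSERT INTO songs VALUES (?, ?, ?)", rows)

# First umenu: all distinct artists.
artists = [r[0] for r in conn.execute(
    "SELECT DISTINCT artist FROM songs ORDER BY artist")]

# Second umenu: titles for the selected artist.
titles = [r[0] for r in conn.execute(
    "SELECT title FROM songs WHERE artist = ? ORDER BY title",
    ("The Beatles",))]
```

Indexing the artist column would keep both lookups fast even across all 23 volumes.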
A database is the best fit for your case (many entries); searching/sorting by artist or title will be easy. If you have 23 volumes of this, it can't be handled comfortably by any Max storage object.
Hey Spectro, I can't find Andrew B's MovieBase on Google or in the Cycling '74 search… what is it?
edit : probably that http://cycling74.com/2008/09/05/data-collection-building-databases-using-sqlite/
Yep, that's the one – well done! I couldn't find the tutorial when I posted that – obviously I didn't try too hard. The term "MovieBase", which I could recall as the name of the database/js file in one of those tutorial patches, was all I could manage, but it's a poor term to search for the tutorial with. Sorry…
Hah, no worries. That SQLite thing seems as powerful as it is undocumented :) but maybe not too hard to grasp, and there's probably a lot of documentation about it outside of the Max realm.
Once you get the main concept of how SQLite is linked with Max, all the rest is in the SQLite docs.
As a start, before adding the extra Max layer on top of SQL/SQLite, I'd recommend going through some SQL tutorials, as the thinking/logic behind SQL database scripts differs a lot from other kinds of programming.
I’m proud of you, forum. This whole thread, not one lurid pun about the thread title.
It took me a while to realize the punny intent of the OP, after which I decided to leave it be, in its pure, discrete, self-sufficient essence ;)
"I'm proud of you, forum. This whole thread, not one lurid pun about the thread title."
–> We are either all intellectuals or REAL GEEKS!!!