parse items of large text file

    Nov 09 2018 | 10:45 am
    Hi List,
    I have a text file with 7 lines of about 80,000 digits each (6 data streams plus 1 index line), which I need to output digit by digit as 6 streams at 500 Hz. It's OK if the rate is not perfectly stable. It seems to be harder than I thought...
    What I tried:
    1. jit.text: read the file line by line into a "jit.matrix 1 long 80000". jit.text will read the text, but it only outputs char values to the long matrix. If I could get one line into such a long-type matrix, it would most likely work...
    2. text: I can read the file into a text object, but I can only output one line at a time. If I use jit.fill to get it into a matrix, the list length is limited.
    3. filein: it will read the file and output it item by item, but I am not able to restore the original values of the text, since I get raw bytes.
    There is probably a way to read it via node.js directly, or via an npm package, but I would have to get more into Node. Getting the data into an audio file would also help, but it is important to be able to restore the exact values; if I do any mapping, it should not quantize.
    Here is a demo.
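
    Since node.js comes up as an option above, here is a minimal Node-flavoured sketch of the parsing step. Everything in it is illustrative, not from the original post: the inline sample stands in for the real file, the first line is assumed to be the index, and `parseStreams` / `tickValues` are made-up helper names. In Node for Max you would read the real file with `fs.readFileSync(path, "utf8")` and send values out with `maxApi.outlet()`.

    ```javascript
    // Hypothetical sketch: parse a 7-line digit file into per-line digit arrays.
    // A small inline sample stands in for the ~80,000-digit-per-line file.
    const sample = [
      "0123456789",   // assumed index line
      "1111111111",   // data stream 1
      "2222222222",   // data stream 2
      "3333333333",
      "4444444444",
      "5555555555",
      "6666666666"
    ].join("\n");

    // Split into lines, then split each line into single-digit numbers.
    function parseStreams(text) {
      return text.trim().split(/\r?\n/).map(line => [...line].map(Number));
    }

    // One "tick" of the playback loop: the i-th digit of every stream.
    function tickValues(streams, i) {
      return streams.map(s => s[i]);
    }

    const streams = parseStreams(sample);
    console.log(streams.length);          // 7 lines
    console.log(tickValues(streams, 2));  // [ 2, 1, 2, 3, 4, 5, 6 ]
    ```

    To drive this at roughly 500 Hz you could call tickValues from setInterval(fn, 2). setInterval is not sample-accurate, but the post says a slightly unstable rate is acceptable.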

    • Nov 09 2018 | 12:40 pm
      Hi, with (included in the bach package) you can read the file and access the numbers individually, but you will lose the formatting into lines. Anyway, if your lines are all the same length, calculating the element index is straightforward.
      Hope this helps, aa
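
      The index calculation Andrea mentions can be sketched as follows, assuming all lines really are the same length (the function name flatIndex is just for illustration):

      ```javascript
      // If the file is flattened into one long sequence and every line holds
      // lineLength digits, the k-th digit (0-based) of line i sits at:
      function flatIndex(line, k, lineLength) {
        return line * lineLength + k;
      }

      // e.g. with 80000 digits per line, digit 5 of line 2:
      console.log(flatIndex(2, 5, 80000)); // 160005
      ```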
    • Nov 09 2018 | 3:20 pm
      Hi Andrea, yes, this helps! I can save the individual lines as separate files with a text editor and duplicate your patch 6 times; this is what I was looking for. Thanks a lot, Falk