
Machine Learning and A.I. predictions for 2018


  • j_scheibel

    @wrapper Ahh, I see. That makes a lot more sense 🙂


  • j_scheibel

    Of course now I want to see if I can make my AI write music… but of course I don’t have a way to say “this is good or this is bad”. Hmmmm


  • wrapper (Moderators)

    The updates are to classifying the parts of songs (intro, verse, chorus) in a clear, consistent way that the neural net can learn more easily (less noise), and to writing out all the verses in full instead of using repeats.

    I’m about 25% through the latest update, but adding more Tabs would also improve the output.

    I’m using a 1400-neuron neural net and a 256-character buffer, i.e. 8 layers of the same size. The system has learned the Tab structure, alternating lines of music and text. It starts songs with a title and composer. It nearly did a couple of rhymes …
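
    For illustration only: wrapper’s actual runs use the Torch/Lua char-rnn code, but a rough PyTorch analogue of the setup described above (8 stacked recurrent layers of 1400 units, reading 256-character windows) might look something like the sketch below. The vocabulary size of 128 is an assumption for the example, not a figure from the thread.

```python
# Hypothetical PyTorch analogue of the char-rnn setup described above:
# 8 stacked LSTM layers of 1400 units, reading 256-character windows.
import torch
import torch.nn as nn

class CharRNN(nn.Module):
    def __init__(self, vocab_size, hidden_size=1400, num_layers=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.LSTM(hidden_size, hidden_size,
                           num_layers=num_layers, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, state=None):
        # x: (batch, seq_len) of character indices; seq_len is the
        # "character buffer", 256 in the setup described above
        h = self.embed(x)
        h, state = self.rnn(h, state)
        return self.out(h), state

model = CharRNN(vocab_size=128)           # assumed character-set size
window = torch.randint(0, 128, (1, 256))  # one 256-character training window
logits, _ = model(window)                 # next-character scores per position
```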


  • j_scheibel

    You sound like you are well beyond this part, but you might see if there are any datasets of guitar tabs or music scores at https://www.kaggle.com/datasets?sortBy=hotness&group=public&page=1&pageSize=20&size=all&filetype=all&license=all for you to digest. There are probably far better repositories of information you can use (and you have probably already found them), but I figured I should at least mention it.


  • wrapper (Moderators)

    Yes, I joined that (Kaggle) when you mentioned it, and I’ve had a look round.

    (Music) Not the sort of thing that is commonly available, except, say, MIDIs of classical music, or the ABC format, which is a text notation for one-line tunes like folk dance music. Both of these would need a data alignment layer to reduce the de-noising work.


  • j_scheibel

    I’m curious: how did you decide the number of layers to use? I actually put layer architecture into the genetic algorithm I wrote (for various reasons, partly so it could mimic that structure, but also so I could import such networks as a starting point; see the sketch below). Because neural nets work fundamentally differently from genetic algorithms (reinforcing connections vs. reinforcing whole models), more than 2 layers never gets me very far.

    Regardless, I was curious how one decides that 8 is enough, as it were.
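
    A minimal, hypothetical sketch of what “putting layer architecture into the genetic algorithm” could look like: the genome is simply a list of hidden-layer widths that gets mutated and recombined between generations. The names, ranges, and rates are illustrative assumptions, not j_scheibel’s actual code, and the fitness function (training and scoring a network built from those widths) is omitted.

```python
# Hypothetical genome encoding for layer architecture in a genetic
# algorithm: each genome is a list of hidden-layer widths.
import random

def random_genome(max_layers=4, max_width=64):
    return [random.randint(2, max_width)
            for _ in range(random.randint(1, max_layers))]

def mutate(genome, rate=0.2, max_width=64):
    g = list(genome)
    if random.random() < rate and len(g) < 8:
        g.insert(random.randrange(len(g) + 1), random.randint(2, max_width))  # add a layer
    if random.random() < rate and len(g) > 1:
        g.pop(random.randrange(len(g)))                                       # drop a layer
    # nudge individual layer widths
    return [max(2, w + random.randint(-4, 4)) if random.random() < rate else w
            for w in g]

def crossover(a, b):
    # splice the front of one architecture onto the back of another
    child = a[:random.randint(0, len(a))] + b[random.randint(0, len(b)):]
    return child if child else [a[0]]

# fitness(genome) would build a network with these layer widths, train it,
# and return a score; that part is problem-specific and omitted here.
```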


  • wrapper (Moderators)

    I did a year of experiments trying to increase the number of layers. Eight layers with 1400 neurons was the maximum I could get, with the extended input layer of 256. Increasing the buffer helps, as neural nets have trouble with memory beyond the buffer.

    I would also have liked to restrict some layers, as this helps to extract higher-level relationships, but that would have meant learning Lua and customizing the char-rnn code (see the sketch after this post).

    I haven’t done any runs for a while, as I’ve been working on 0.9.6.x. Last time there had been some improvements to the code, and I was able to increase the neurons from a maximum of 400 to 1400.

    Here are some extended layer experiments I did with evolvehtml: https://github.com/wrapperband/evolvehtml
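
    For illustration, a hedged sketch of the “restricted layer” idea mentioned above: not the Lua/char-rnn customization wrapper refers to, but the same principle in PyTorch. Squeezing one hidden layer down to a much smaller width forces the network to compress what it has learned, which tends to surface higher-level relationships. The widths are assumptions chosen to echo the 1400-neuron figure.

```python
# Hypothetical "restricted" (bottleneck) layer between full-width layers:
# the narrow middle layer forces a compressed, higher-level representation.
import torch.nn as nn

bottlenecked = nn.Sequential(
    nn.Linear(1400, 1400), nn.Tanh(),
    nn.Linear(1400, 128),  nn.Tanh(),  # restricted layer: 1400 -> 128
    nn.Linear(128, 1400),  nn.Tanh(),  # expand back out
    nn.Linear(1400, 1400),
)
```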


  • wrapper (Moderators)

    Like now: if I watch a video about Apollo, I have to be convinced the world is flat.

    The A.I. running YouTube is trying to convince me that all the horrors of totalitarian machine learning systems, which are being used to spy on everyone, are going to be blamed on and associated with “Blockchain” and a “Singularity”. A false flag against Blockchain?


  • j_scheibel

    I for one support our new A.I. overlords… hehehe


  • bluebox

    It’s already been the big thing for a couple of years.

    I can’t tell you the specifics of what we’re doing with it, but we’ve had an 8x Volta 80 Gbit NVLink system on order for a couple of months now. 🤤

    Imagine what 40,000 CUDA cores, 5,000 tensor cores, and crazy-fast HBM memory all ganged up in one box can do these days. Distributed systems have been designed to allow several multi-GPU boxes to be connected. That’s the reason ML/DL has taken off so quickly: this kind of raw power was practically unthinkable a mere half dozen years ago, and it is making models that were only dreamed of before workable today. But then we all knew where GPGPU was headed anyway…

    I had picked up a book on neural networks twenty years ago, and wondered then just what good that would ever serve in our lifetime. Who knew.

    (btw, quantum computing is at the same stage now that nn’s were back then… :))


  • F3derz

    Anything related to this, or any predictions for 2019?