I am currently taking MIT’s ConLang class. Over the course of this semester, I’ll construct a fictional language called Nysaerith (IPA: /njsaːeriθ/). I think the best way to improve a project like this is to get a ton of feedback, so I’ll post periodic updates on my progress. Let’s get started!
No language can exist without a fictional world, so I’ll set the background first. In a fantasy universe, Nysaerith is the language of magic itself. It binds the world and guides it. Words in Nysaerith have real magical power. Those who speak it
become one with the world, and the world bends to their will. They are called mages. There were many mages in the early days. They could move mountains and call upon storms. However, their desire for power soon corrupted them. They fought and massacred each other until no one who truly understood Nysaerith was left alive.
Many years have passed since then. Now Nysaerith is preserved only in ancient runes and in incantations that were passed down orally. As the language evolved, its magical power also waned significantly. If this were a fantasy novel, the main character would travel around the world to reconstruct Nysaerith, haha.
Anyways, I think this background is sufficient.
Here are the phonemes of Nysaerith and a list of twenty words. You’ll notice several uncommon consonants and long vowels. This is intentional: I want the language to sound mystical and rhythmic, since I’ll later write many spells in it.
I am continuously improving the language, so some discrepancies in future posts are to be expected. That said, any criticism of my word list is welcome!
Variational Autoencoders (VAEs) are really cool machine learning models that can generate new data. This means a VAE trained on thousands of human faces can generate new human faces, as shown above!
Recently, two types of generative models have been popular in the machine learning community: Generative Adversarial Networks (GANs) and VAEs. While GANs have had more success so far, a recent DeepMind paper showed that VAEs can yield results competitive with state-of-the-art GAN models. Furthermore, VAE-generated images retain more of the diversity of the training dataset than their GAN counterparts. Continue reading An Introduction to Variational Autoencoders→
Let’s have a look at this god-tier math puzzle. Only one person out of seven gets it right; the other six don’t. So, what are the odds of solving it correctly? 1 to 6. In general, if \(p\) is the probability that someone gets it right, then their odds are \(p/(1-p)\).
However, it isn’t necessarily true that \(p=1/7\) for every person: some people are smarter, some have a better education, and so on. Hence, \(p\) also depends on the person attempting the puzzle. In a Bayesian framework, we capture this dependence with conditional probability.
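The probability-to-odds conversion above is easy to check numerically. Here is a minimal sketch (the helper names `odds` and `prob` are mine, not from the puzzle):

```python
def odds(p: float) -> float:
    """Convert a probability of success into odds in favor, p / (1 - p)."""
    return p / (1 - p)

def prob(o: float) -> float:
    """Convert odds in favor back into a probability, o / (1 + o)."""
    return o / (1 + o)

# If 1 out of 7 solvers gets the puzzle right, p = 1/7,
# and the odds are (1/7) / (6/7) = 1/6, i.e. "1 to 6".
print(odds(1 / 7))        # 1/6 ≈ 0.1667
print(prob(odds(1 / 7)))  # back to 1/7 ≈ 0.1429
```

The two functions are inverses of each other, which is why statisticians switch freely between the two representations.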
My mistress’ eyes are nothing like the sun;
Coral is far more red than her lips’ red;
If snow be white, why then her breasts are dun;
If hairs be wires, black wires grow on her head.
I have seen roses damask’d, red and white,
But no such roses see I in her cheeks; Continue reading A Week of Poetry: Day 3→
This is a very unusual post for this blog. I hope my regular readers will bear with me.
I love Game of Thrones. I have read the books and the companion novels. I have been following the series for years. I am familiar with most of the fan theories too. Tonight the series finale aired, and I didn’t like how the show ended. It felt rushed and baffling. Continue reading How Game of Thrones Should Have Ended→
The distribution of primes plays a central role in number theory. The famous mathematician Gauss conjectured that the number of primes between \(1\) and \(n\) is roughly \(n/\log n\). This estimate becomes more and more accurate as \(n\to \infty\). We write \(\pi(n)\) for the number of primes between \(1\) and \(n\). So, mathematically, Gauss’s conjecture is equivalent to the claim
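Gauss’s estimate can be checked numerically with a sieve of Eratosthenes. A quick sketch (the function name `prime_count` is mine) that compares \(\pi(n)\) against \(n/\log n\):

```python
import math

def prime_count(n: int) -> int:
    """Compute pi(n), the number of primes <= n, via a sieve of Eratosthenes."""
    if n < 2:
        return 0
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for i in range(2, math.isqrt(n) + 1):
        if is_prime[i]:
            # Mark all multiples of i starting from i*i as composite.
            for j in range(i * i, n + 1, i):
                is_prime[j] = False
    return sum(is_prime)

for n in (10**3, 10**5, 10**6):
    ratio = prime_count(n) / (n / math.log(n))
    # The ratio drifts toward 1 as n grows, though quite slowly.
    print(n, round(ratio, 4))
```

For example, \(\pi(100) = 25\) while \(100/\log 100 \approx 21.7\), so the approximation undershoots for small \(n\) but improves steadily.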