Yandex has announced a beta version of the YandexART (Vi) neural network, which is designed to create videos five seconds long. It realistically depicts the movement of objects, whether sea waves, a person, or an animal.
In practice, the company suggests using videos created by the model as animated phone wallpapers; the neural network should also be useful for bloggers and people in creative professions. YandexART (Vi) is available in the Shedevrum app.
Yandex presented the previous version of its text-to-video neural network in August of last year. That version produced animation that conveyed the movement of the camera rather than of the objects, and the objects themselves could differ noticeably from frame to frame.
A distinctive feature of YandexART (Vi) is its ability to reproduce realistic motion by accounting for the relationships between frames, so that objects in the frame move smoothly and believably. To achieve this, the model was trained on videos of moving objects, such as a driving car or a sneaking cat.
To use the video generation feature, the user only needs to describe in text what they want to see in the frame. The neural network first creates an image that will serve as the starting frame of the animation, and then turns digital noise into a sequence of frames that matches both the text request and the sample first frame.
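The description above matches a two-stage pipeline common in diffusion-based video generators: a text-to-image step produces the first frame, and an image-conditioned denoising step turns random noise into the remaining frames. Yandex has not published the model's code, so the sketch below is only a minimal structural illustration of that general idea in Python; every name and number in it (generate_first_frame, denoise_video, the frame count, the blending math) is a hypothetical placeholder rather than any real YandexART API.

```python
# Minimal structural sketch (not Yandex's actual model): stage 1 stands in for
# a text-to-image generator producing the first frame; stage 2 turns digital
# noise into a frame sequence conditioned on that first frame.
import numpy as np

FRAME_SHAPE = (64, 64, 3)   # toy resolution, not a real video size
NUM_FRAMES = 25             # e.g. ~5 seconds at 5 fps in this toy setup
NUM_STEPS = 50              # number of iterative denoising steps


def generate_first_frame(prompt: str) -> np.ndarray:
    """Stand-in for the text-to-image stage: returns a pseudo-image derived
    from the prompt (a real system would run a generative model here)."""
    seed = abs(hash(prompt)) % (2**32)
    return np.random.default_rng(seed).random(FRAME_SHAPE)


def denoise_video(prompt: str, first_frame: np.ndarray,
                  rng: np.random.Generator) -> np.ndarray:
    """Stand-in for the video stage: starts from pure noise and iteratively
    pulls every frame toward the conditioning first frame, mimicking the
    'noise -> frame sequence guided by the sample frame' structure."""
    frames = rng.random((NUM_FRAMES,) + FRAME_SHAPE)   # digital noise
    target = np.broadcast_to(first_frame, frames.shape)
    for step in range(NUM_STEPS):
        # A real model would predict the noise with a network conditioned on
        # the prompt and first frame; here we simply blend toward the
        # conditioning frame to show the iterative loop structure.
        alpha = (step + 1) / NUM_STEPS
        frames = (1 - alpha) * frames + alpha * target
    return frames


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prompt = "sea waves rolling onto the shore"
    first_frame = generate_first_frame(prompt)
    video = denoise_video(prompt, first_frame, rng)
    print("generated frame sequence:", video.shape)  # (25, 64, 64, 3)
```

In an actual text-to-video model, the blending loop would be replaced by a learned denoiser that also uses the text prompt at every step; the sketch only mirrors the two-stage flow described in the announcement.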