
Hydra


Set of tools for livecoding networked visuals. Inspired by analog modular synthesizers, these tools are an exploration into using streaming over the web for routing video sources and outputs in realtime.

Hydra uses multiple framebuffers to allow dynamically mixing, compositing, and collaborating between connected browser-visual-streams. Coordinate and color transforms can be applied to each output via chained functions.

Note: experimental/in development. Right now it only works on Chrome or Chromium, on machines with WebGL. I welcome pull requests, as well as comments, ideas, and bug reports in the issues section =]

For more information, see getting started, the getting-started PDF in Spanish, tutorials and examples, the complete list of functions, the gallery of user-generated sketches, or a talk about the motivations for creating hydra.

Note: this repository is for the online version of hydra; other pieces of hydra are published as separate modules.

Getting started

Go to https://hydra.ojack.xyz

  • CTRL-Enter: run a line of code
  • CTRL-Shift-Enter: run all code on screen
  • ALT-Enter: run a block
  • CTRL-Shift-H: hide or show code
  • CTRL-Shift-F: format code using Prettier
  • CTRL-Shift-S: save a screenshot and download it as a local file
  • CTRL-Shift-G: share to Twitter (if available); shares to @hydra_patterns

All code can be run either from the in-browser text editor or from the browser console.

Check @hydra_patterns for patterns folks have shared as an easy way to get started.

Basic functions

render an oscillator with parameters frequency, sync, and rgb offset:

osc(20, 0.1, 0.8).out()

rotate the oscillator 0.8 radians:

osc(20, 0.1, 0.8).rotate(0.8).out()

pixelate the output of the above function:

osc(20, 0.1, 0.8).rotate(0.8).pixelate(20, 30).out()

show webcam output:

s0.initCam() // initialize a webcam in source buffer s0
src(s0).out() // render source buffer s0

If you have more than one camera connected, you can select the camera using an index:

s0.initCam(1) // initialize a webcam in source buffer s0

webcam kaleidoscope:

s0.initCam() // initialize a webcam in source buffer s0
src(s0).kaleid(4).out() // render the webcam to a kaleidoscope

You can also composite multiple sources together:

osc(10)
  .rotate(0.5)
  .diff(osc(200))
  .out()

By default, the environment contains four separate output buffers that can each render different graphics. The outputs are accessed by the variables o0, o1, o2, and o3.

to render to output buffer o1:

osc().out(o1)
render(o1) // render the contents of o1

If no output is specified in out(), the graphics are rendered to buffer o0. To show all render buffers at once:

render()

The output buffers can then be mixed and composited to produce what is shown on the screen.

s0.initCam() // initialize a webcam in source buffer s0
src(s0).out(o0) // set the source of o0 to render the buffer containing the webcam
osc(10, 0.2, 0.8).diff(o0).out(o1) // initialize a gradient in output buffer o1, composite with the contents of o0
render(o1) // render o1 to the screen

The composite functions blend(), diff(), mult(), and add() perform arithmetic operations that combine the input texture color with the base texture color, similar to Photoshop blend modes.
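As a rough illustration in plain JavaScript (hydra performs the real work on the GPU in GLSL; the clamping behavior here is an assumption), the composite operations amount to per-channel arithmetic on normalized [r, g, b] colors:

```javascript
// Illustrative per-channel arithmetic for the composite functions.
// These sketches only show the shape of the math, not hydra's source.
const addPx   = (a, b) => a.map((v, i) => Math.min(v + b[i], 1))
const multPx  = (a, b) => a.map((v, i) => v * b[i])
const diffPx  = (a, b) => a.map((v, i) => Math.abs(v - b[i]))
// blend(amount) acts as a linear crossfade between the two colors
const blendPx = (a, b, amount) => a.map((v, i) => v * (1 - amount) + b[i] * amount)
```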

modulate(texture, amount) uses the red and green channels of the input texture to modify the x and y coordinates of the base texture. More about modulation at: https://lumen-app.com/guide/modulation/

osc(21, 0).modulate(o1).out(o0)
osc(40).rotate(1.57).out(o1)
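The displacement itself can be sketched in plain JavaScript (hypothetical names; hydra does this per pixel in the shader): the red and green values of the modulating texture offset where the base texture is sampled.

```javascript
// Sketch of coordinate modulation (not hydra's actual source): the
// modulating texel's red/green channels displace the sampling coordinate.
function modulateCoord(x, y, texel, amount = 0.1) {
  return [x + texel.r * amount, y + texel.g * amount]
}
```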

use a video as a source:

s0.initVideo("https://media.giphy.com/media/AS9LIFttYzkc0/giphy.mp4")
src(s0).out()

use an image as a source:

s0.initImage("https://upload.wikimedia.org/wikipedia/commons/2/25/Hydra-Foto.jpg")
src(s0).out()

Passing functions as variables

Each parameter can be defined as a function rather than a static value. For example,

osc(function(){return 100 * Math.sin(time * 0.1)}).out()

modifies the oscillator frequency as a function of time. (time is a global variable that represents the milliseconds that have passed since the page loaded.) This can be written more concisely using ES6 arrow syntax:

osc(() => (100 * Math.sin(time * 0.1))).out()
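Under the hood, a synth that accepts either kind of argument just has to check the parameter's type each frame. A minimal sketch (not hydra's actual implementation, which passes a context object to the function):

```javascript
// Minimal sketch: resolve a parameter that may be a constant or a
// function, once per rendered frame (hypothetical helper name).
const resolveParam = (p, time) => (typeof p === 'function' ? p(time) : p)

resolveParam(60, 0)                        // a static value passes through
resolveParam((t) => 100 * Math.sin(t), 0)  // a function is called with time
```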

Desktop capture

Open a dialog to select a screen, window, or browser tab to use as an input texture:

s0.initScreen()
src(s0).out()

Connecting to remote streams

Any hydra instance can use other instances/windows containing hydra as input sources, as long as they are connected to the internet and not blocked by a firewall. Hydra uses WebRTC (web real-time communication) under the hood to share video streams between open windows. The included module rtc-patch-bay manages connections between connected windows, and can also be used as a standalone module to convert any website into a source within hydra. (See standalone camera source below for an example.)

To begin, open hydra simultaneously in two separate windows. In one of the windows, set a name for the given patch-bay source:

pb.setName("myGraphics")

The title of the window should change to the name entered in setName().

From the other window, initiate "myGraphics" as a source stream.

s0.initStream("myGraphics")

render to screen:

s0.initStream("myGraphics")
src(s0).out()

The connections sometimes take a few seconds to be established; open the browser console to see progress. To list available sources, type the following in the console:

pb.list()

Using p5.js with hydra

// Initialize a new p5 instance. It is only necessary to call this once.
p5 = new P5() // {width: window.innerWidth, height:window.innerHeight, mode: 'P2D'}

// draw a rectangle at point 300, 100
p5.rect(300, 100, 100, 100)

// Note that p5 runs in instance mode, so all functions need to be prefixed with the variable holding the p5 instance (in this case p5)
// reference for p5: https://p5js.org/reference/
// explanation of instance mode: https://github.com/processing/p5.js/wiki/Global-and-instance-mode

// When live coding, the setup() function of p5.js has essentially no use; anything you would normally call in setup() can simply be called outside of any function.

p5.clear()

for(var i = 0; i < 100; i++){
  p5.fill(i*10, i%30, 255)
  p5.rect(i*20, 200, 10,200)
}

// To live code animations, you can redefine the draw function of P5 as follows:
// (a rectangle that follows the mouse)
p5.draw = () => {
  p5.fill(p5.mouseX/5, p5.mouseY/5, 255, 100)
  p5.rect(p5.mouseX, p5.mouseY, 30, 150)
}

// To use P5 as an input to hydra, simply use the canvas as a source:
s0.init({src: p5.canvas})

// Then render the canvas
src(s0).repeat().out()

Loading external scripts

The loadScript() function (used with await) lets you load other packaged JavaScript libraries within the hydra editor. Any JavaScript code can run in the hydra editor.

Here is an example using Three.js from the web editor:

await loadScript("https://threejs.org/build/three.js")

scene = new THREE.Scene()
camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000)

renderer = new THREE.WebGLRenderer()
renderer.setSize(width, height)
geometry = new THREE.BoxGeometry()
material = new THREE.MeshBasicMaterial({color: 0x00ff00})
cube = new THREE.Mesh(geometry, material);
scene.add(cube)
camera.position.z = 1.5

// 'update' is a reserved function that will be run every time the main hydra rendering context is updated
update = () => {
  cube.rotation.x += 0.01;
  cube.rotation.y += 0.01;
  renderer.render( scene, camera );
}

s0.init({ src: renderer.domElement })

src(s0).repeat().out()

And here is an example loading the Tone.js library:

await loadScript("https://unpkg.com/tone")

synth = new Tone.Synth().toDestination();
synth.triggerAttackRelease("C4", "8n");

Running locally

To run locally, you must have Node.js installed. Install Node.js and npm from https://nodejs.org/en/.

To run, open terminal and enter the directory of the hydra source code:

cd hydra

install dependencies:

npm install

run the dev environment:

npm run dev

Connecting to a server from the dev/local editor environment

This repo contains only the hydra editor frontend. You can connect to a backend server (https://github.com/hydra-synth/hydra-server) for signaling and gallery functionality. To do this, set up hydra-server, then create a .env file in the root of the hydra directory and add the URL of your server as a line in the .env file:

VITE_SERVER_URL=http://localhost:8000

(replace http://localhost:8000 with the url of your server)

Audio Responsiveness

FFT functionality is available via an audio object accessed as a. The editor uses https://github.com/meyda/meyda for audio analysis. To show the fft bins:

a.show()

Set number of fft bins:

a.setBins(6)

Access the value of the leftmost (lowest frequency) bin:

a.fft[0]

Use the value to control a variable:

osc(10, 0, () => (a.fft[0]*4))
  .out()

It is possible to calibrate the responsiveness by changing the minimum and maximum values detected (represented by the blue lines over the fft). To set the minimum value detected:

a.setCutoff(4)

Setting the scale changes the range that is detected.

a.setScale(2)

a.fft[] returns a value between 0 and 1, where 0 represents the cutoff and 1 corresponds to the maximum.
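A hypothetical mapping consistent with this description (the name and exact formula are assumptions, not hydra's source): energy at the cutoff maps to 0, energy at cutoff plus scale maps to 1.

```javascript
// Hypothetical normalization of a raw FFT bin, clamped to [0, 1]:
// the cutoff maps to 0, cutoff + scale maps to 1.
function normalizeBin(raw, cutoff, scale) {
  return Math.min(Math.max((raw - cutoff) / scale, 0), 1)
}
```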

You can set smoothing between audio level readings (values between 0 and 1). 0 corresponds to no smoothing (more jumpy, faster reaction time), while 1 means that the value will never change.

a.setSmooth(0.8)
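One common scheme matching this description is an exponential moving average; the sketch below is written under that assumption and may differ from hydra's exact formula.

```javascript
// Exponential moving average: smooth = 0 tracks the raw reading
// instantly; smooth = 1 never moves away from the previous value.
const smoothed = (prev, raw, smooth) => prev * smooth + raw * (1 - smooth)
```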

To hide the audio waveform:

a.hide()

MIDI (experimental)

MIDI controllers can work with Hydra via WebMIDI; an example workflow is at /docs/midi.md.

API

There is an updated list of functions at /docs/funcs.md, as well as in the source code for hydra-synth.

CHANGELOG

See CHANGELOG.md for recent changes.

Libraries and tools used:

  • Regl: functional webgl
  • glitch.io: hosting for sandbox signalling server
  • codemirror: browser-based text editor
  • simple-peer

Inspiration:

Related projects:

Contributors

(Adapted from p5.js)

Olivia Jack: 💻 📝 🐛 🎨 📖 📋 💡 💵 🔍 🤔 🚇 🔌 💬 👀 📢 ⚠️ 🔧 🌍 📹
Jamie Fenton: 💻 🤔 📹
Naoto Hieda: 📖 📋 💡 🤔
flordefuego: 📖 📋 💡 🤔 📹
Zach Krall: 📖 💻 💡
GEIKHA: 🐛 💻 📋 💡 🤔 🔌 🌍
Bruce LANE: 💻 💡 🤔
fangtasi: 🌍
Haram Choi: 🌍
papaz0rgl: 🌍
Artur Cabral: 🌍
Rangga Purnama Aji: 🌍
Jack Armitage: 💻
Guy John: 💻
Christopher Beacham: 💻
Sam Thursfield: 💻
Dmitriy Khvatov: 💻
Yancy Way: 💻
tpltnt: 💻
Andrew Kowalczyk: 💻
ethancrawford: 💻
Hamilton Ulmer: 💻
Josh Morrow: 💻
Nobel Yoo: 💻
Pablito Labarta: 💻
Paul W. Rankin: 💻
Timo Hoogland: 💻
Ramil Iksanov: 💻
J. Francisco Raposeiras: 💻
Lars Fabian Tuchel: 💻
oscons: 💻
Richard Nias: 💻
Luis Aguirre: 💻
Damián Silvani: 💻
m. interrupt: 💻
Ámbar Tenorio-Fornés: 💻 🤔

We recognize all types of contributions. This project follows the all-contributors specification. Instructions to add yourself or add contribution emojis to your name are here. You can also post an issue or comment with the text: @all-contributors please add @YOUR-USERNAME for THING(S) and our nice bot will add you.


hydra's Issues

Negative blend values give black screen

Negative blend values work for some time, but then show a black screen, seems to indicate an error that crashes rendering

Try the following pattern in the Hydra editor. After a few seconds, comment in the negative blend value. The colors bleach (a cool effect!) and then the entire render goes black. If you comment it back out, the image comes back, which makes me suspect a rendering error.


osc(100, 0.01, 1.4)
	.rotate(0, 0.1)
	.mult(osc(10, 0.2)
          .modulateKaleid(
             voronoi(3))
         )          
	.color(2.83,0.91,0.39)
  .out(o0)


src(o1)
    .blend(o0, () => 0.01)
  //.blend(o0, () => -0.02)


  .out(o1)

render(o1)

osc(10).kaleid(6).luma(0.9).out(o2)

Forgetting to call a function will crash Hydra

If working with functions, it is easy to forget a function call, which will fatally crash Hydra and require a restart.

e.g.

var myfunc = () => 5
osc(() => myfunc).out()

Although this example is very simple, it is easy to forget a call when a function is curried.

var timeMod= mod => ({time}) =>  time%mod

osc(timeMod(60)).out() // all good

osc(params => timeMod(60)).out() // oops, forgot to pass params; will crash
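One way a synth could guard against this class of mistake is to reject parameters that resolve to yet another function. A sketch of that idea (hypothetical helper, not hydra's actual handling):

```javascript
// Defensive parameter check: if resolving the parameter yields another
// function, a call was probably forgotten, so fail loudly instead of
// crashing the renderer.
function evalParam(p, ctx) {
  const v = typeof p === 'function' ? p(ctx) : p
  if (typeof v === 'function') {
    throw new TypeError('parameter resolved to a function; forgot to call it?')
  }
  return v
}
```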

Using math. (lowercase) instead of Math. freezes Hydra

There's something about this sine function that freezes Hydra without showing any error. The Hydra web page needs to be reloaded in the browser.

This chain works well

osc(10,0.1)
  .thresh(0.1).scale(3)
.out()

This one freezes Hydra. No error messages

osc(10,0.1)
  .thresh(0.1).scale(()=>math.sin(time))
.out()

OS:Mac OSX - High Sierra
Browser: Chrome Version 75.0.3770.100 (Official Build) (64-bit)
Hydra: Web

Ableton/ csound with Hydra

Hello-

Would anyone have a script for hydra that would allow me to receive csound and Ableton OSC messages? I understand that csound can transmit OSC events.

Best
Zach

Loopback: using hydra like a fake webcam

Hello! Hello hello!

Is there any way to make Hydra look like a webcam on a computer? Hydra is such cool tech, & it's such a great tool to be able to play with media over. It'd be wonderful if it could be "fed" back in to other programs running on the computer, such that I could use Hydra to output to my video conferencing system, or to my streaming system.

This would probably be operating system dependent. I happen to be on Linux. I feel like there's probably some gstreamer pipeline I could launch from the command line that could make this happen, but it's not my domain. I'll try to report back to this thread with what I find. It'd be lovely if others had resources or ideas or knew how to make Hydra "loopback" like this.

Output vidRecorder buffer to file?

This actually already exists! But I hadn't mentioned it because I wasn't sure about the API, and it possibly has some bugs.

// run this line to start recording
vidRecorder.start()

// stop recording
vidRecorder.stop()

// set the output video as a source
s0.init({src: vidRecorder.output})

// display the source
src(s0).out()

Originally posted by @ojack in https://github.com/ojack/hydra/issues/26#issuecomment-417385237

Is there a way of taking the video file currently stored in the buffer and simply writing it out to a file? It feels like this would be much more controlled and smoother than taking a screen recording, etc. If not, how are you able to produce such smooth and accurate output videos?

Either this, or is there a way to pass some parameters to the screencap() function, something like the saveFrame() function in Processing, for compiling into a video file afterwards? By parameters, I mean something that can suppress the dialog box for the path and assign a file name, so that it can be placed in a loop.

I am loving Hydra, thank you for all your work on it!

2 ideas for cool effects to add

2 Ideas for cool functions to add to this. I may try to add them if I can figure out how.

  • a 3d shape function: so you could render a sphere, a cube, or another legible 3d shape, in addition to the existing 2d shapes. I think if it were rendered in greyscale, you could mix effects on top of it pretty easily to make cool effects.

  • A 'datamoshing' effect, which you can get from compression artifacts where frames are missing. The image looks frozen, but then later movement modulates the position of the frozen image. It's used in this music video: https://vimeo.com/4578366. It's a really cool effect, though this one seems more difficult.

Some sketches that use additive feedback won't work when started at the same time as Hydra

Description of the issue

In particular, with sketches that add some output to themselves: when made into a link and opened, the feedback seems to start before Hydra can load, so a blank white screen shows. When this happens, it can be fixed by calling the hush() function and re-evaluating the sketch's code. I've tried writing a hush() statement before the rest of the code, and it still happens.

Example

The last time I encountered this problem was in this sketch I made

Quick solution I found

I found, as a momentary solution, putting a hush() statement before the code and then wrapping the image-generating code in a setTimeout() call, making Hydra wait a bit before executing it. Here's an example of that.

So...

I assume this is Hydra trying to process the feedback before it can process the images themselves. Maybe there could be a way to make Hydra wait a bit longer, from the inside.

Impossible to find the IP address of hydra-editor.glitch.me.

Hi,
I'm trying to enter the site, but the server isn't reachable; the error is DNS_PROBE_FINISHED_NXDOMAIN.
I tried with Linux Mint and with Windows, using several browsers, but the error is the same. I know that Windows sometimes has firewall problems that block access to hydra, but I'm not sure what to do.
It's the second time I've tried to enter the site, and the first time was from a different Wi-Fi connection (now I'm trying from a home connection). Do you have problems accessing hydra, too? Is it a server problem?

Way to pass in logging function to Hydra synth

Certain functions in hydra-synth trigger user feedback, which right now is just logged to the console. Provide a way to pass in a logging function so that this information can be relayed to the user in either atom-hydra or the online editor.

Capture video into a source buffer

After seeing https://webrtc.github.io/samples/src/content/getusermedia/record/
I think it could be interesting to capture short loops into one of the source buffers (s0, s1, s2, s3) for further manipulation.

i.e., I could record a short clip from the webcam and loop it, as opposed to taking the live webcam feed constantly.

This could also be used to 'bounce down' heavy processing which could help keep framerates up

Similar to #10 but more oriented toward live capture.

better control over which screenshot gets tweeted

When clicking the "upload" button on https://hydra-editor.glitch.me, the canvas temporarily halts, displaying the current state of the visualization, which suggests to the user that this is the screenshot that is going to be saved.

However, I think the visualization keeps running in the background, and when clicking "OK" in the prompt, the (non-displayed) state of the canvas gets tweeted.

I would suggest temporarily rendering the output of the canvas when the "upload" button is clicked and publishing that render, so that one has control over which picture gets published.

Feature Request: Add Midimapping for Numbers

1. Right-click the number you want to control with a MIDI controller.
2. In the context menu, select "Learn MIDI CC".
3. Click the "Find" button and move your MIDI controller fader/knob/button; enter the min and max values for this field; click the "OK" button.

Expected result:
If one now uses the knob on the CSX MIDI controller with CC value 12, one can control the second modulateRotate parameter from -12 to 1.7 by moving/rotating/pressing the MIDI controller fader/knob/button.

[screenshot]

Or you could implement it programmatically:

[screenshot]

Control flow primitives: if()

It would be nice if hydra supported one or more control-flow primitives.

Just the single primitive if() (even without else) could be very powerful. Pseudocode example:

interactive multi-sketch selection:

if(mouse.x > 0.5) { render(o0) }
if(mouse.x < 0.5) { render(o1) }

or a cycling timed sketch sequence:

if(120 < time%360) { render(o0) }
if(240 < time%360) { render(o1) }
if(360 < time%360) { render(o2) }

Passing functions as parameters

Robust and flexible way to handle all types of parameters

  1. javascript functions that compile to glsl functions, i.e:
    osc().add(osc())

  2. javascript functions that are passed in at every render pass:
    osc((t)=>(3*t))
    How can this syntax be improved? Should certain functions be provided by default? Can this be used for passing osc parameters and/or beat detection, etc.?

  3. Static values:
    osc(2.0)

Ideally, the user should not have to think too much about types; each function should accept all three kinds of inputs and adapt accordingly.

mipmapping

Could be nice if mipmaps were generated for output textures (maybe they already are?), and to expose a way to set the bias parameter of texture2D() for blur effects (a small bias could be used for reaction-diffusion stuff; a very large bias could be used to get the average colour of the whole image for things like equalisation).

I tried implementing a blur by adding scrolled copies, but failed miserably.

generating mipmaps is an expensive operation, so it should probably be optional...

https://webglfundamentals.org/webgl/lessons/webgl-3d-textures.html (about 2/5 of the way down the page)
http://docs.gl/es2/glGenerateMipmap
http://docs.gl/el3/texture

No extensions folder

I was trying to install the screen-capture-extension following the project's Readme, but none of the folders mentioned exist:

To install, go to chrome://extensions Click "Load unpacked extension", and select the "extensions" folder in "screen-capture-extension" in this repo. Restart chrome. The extension should work from now on without needing to reinstall.

Where can I find it?

Thank you!

Add 'hideCode()' function

It'd be useful to have a coded statement that hides the code. Sometimes I want to share a mouse-interactive sketch with my non-coder friends, and I always have to ask them to press Ctrl+Shift+H to experience it. Adding a hideCode() statement after my code could make Hydra hide the code on opening. Maybe some sort of "Code is hidden, Ctrl+Shift+H to show" message should be shown for a couple of seconds so people know they can read the code if they want to.

hyperlinks sometimes fail to load

hydra is incredibly fun!

When I went to share something I created on a forum, I noticed an issue where the hyperlink wasn't always loading. If it didn't work, the following error would be present in the console:

gallery.js:75 Uncaught DOMException: Failed to execute 'atob' on 'Window': The string to be decoded is not correctly encoded.
    at Gallery.decodeBase64 (https://hydra-editor-v1.glitch.me/bundle.js:402:31)
    at Gallery.setSketchFromURL (https://hydra-editor-v1.glitch.me/bundle.js:384:24)
    at new Gallery (https://hydra-editor-v1.glitch.me/bundle.js:351:12)
    at init (https://hydra-editor-v1.glitch.me/bundle.js:31:18)

The hyperlink is in this post. CMD+clicking to open in a new tab seems to work, but just clicking to open in the current tab doesn't.

Here is the code:

// by John Mitchell
// https://johnmitchell.tech

osc(() => 1 * Math.sin(time * .23) + 2, .004, 100)
  .rotate(.2)
  .blend(o1, .01)
  .diff(o2)
  .contrast(() => .2 * Math.sin(time * .6) + 1)
  .out(o0)

osc(3, -.04, 100)
  .rotate(() => 4 * Math.sin(time * .001))
  .out(o1)

osc(5, -.02, 70)
  .rotate(() => Math.sin(time * .05) + 80 * Math.sin(100 * Math.sin(time * .0001)) * .3)
  .out(o2)

text()

Would it be possible/useful/interesting to add text as a source? e.g.

text('wave').modulate(osc(10)).out()

Not sure how painful it is to implement text/fonts in a shader, or if regl provides a simple way to achieve that. Anyhow, super project!

Connecting to remote streams

Hi,

First, thanks for developing such an awesome tool!

I'm having problems connecting to remote streams in Hydra. Changing the tab name in the first window seems to work fine, but when I then try to connect using s0.initStream() in the second window, I don't see the remote stream and get the following error in the console (see attached screenshot). Any ideas what's going on here?
[screenshot: hydra_error]

As a user, I want to send hydra code remotely through a connection.

It would be very handy to type hydra code in a browser window and have it affect the video feedback in another window, on either my machine or a remote connection. This would be very useful for getting video only, and also for remote live coding without video compression artifacts.

WebGL error when screen sharing Chrome tab

hi !

i get this error:
Failed to execute 'shaderSource' on 'WebGLRenderingContext': parameter 1 is not of type 'WebGLShader'.
when trying to use the screen sharing extension to pull a Chrome tab into hydra using
s0.initScreen()
src(s0).out()
locally or using the remote hydra URL.

update: this also happens immediately when trying to use screen sharing extension with VLC (application window).

here is what the console says when it crashes:
[.WebGL-000000C82EF494A0]GL ERROR :GL_INVALID_VALUE : glCopySubTextureCHROMIUM: source texture bad dimensions.
WebGL: CONTEXT_LOST_WEBGL: loseContext: context lost
Error: (regl) context lost

Also, I've been trying the WebRTC examples, particularly getUserMedia, and it comes back with a NotFoundError. Could this be the problem? How can it be fixed?

I'm on Chrome 69, Windows 8.1. Thanks!

Error in function freezes editor

Hello Olivia!

I think this bug was introduced on the new version from last week.

This code runs ok:
osc( ()=>Math.sin(time),.1,0 ).out

But if an error is made inside the function, like:
osc( ()=>Math.sin(**timeeee**),.1,0 ).out

It freezes the editor. Fixing the typo and running again does not unfreeze it; the page needs to be reloaded for Hydra to work again.

Using webcam on Windows crashes regl

Hi!

When I try to run the webcam example on the latest build at: https://hydra-editor-v1.glitch.me

ie
s0.initCam()
src(s0).out()

The output loads for a few seconds and crashes to a black screen a few moments later. The console then starts logging the following error over and over:

regl.js:47 Uncaught Error: (regl) context lost
at Function.raise (regl.js:47)
at Object.REGLCommand [as draw] (regl.js:9396)
at passes.forEach (output.js:196)
at Array.forEach ()
at Output.tick (output.js:196)
at HydraSynth.tick (index.js:293)
at Engine. (index.js:52)
at emitOne (events.js:106)
at Engine.emit (events.js:184)
at Engine.tick (index.js:42)

Initial Test - Reproduced on Win10 desktop, Chrome 68, GTX1070, dual 2560x1440 144Hz resolution monitors (one rotated to portrait) and this model of webcam https://www.amazon.com/ELP-2-8-12mm-Varifocal-1-3megapixel-Android/dp/B01N8YH5VY

The error can be cleared by refreshing, but it fails again a few seconds after initCam().

Test 2 - Cannot reproduce with the same webcam attached to an older MBP with an Nvidia card, also on Chrome 68. This should rule out the camera as the cause.

Test 3 - 144Hz is only supported over DisplayPort, and one of my screens is connected via HDMI and runs at 60Hz instead. Unable to reproduce on the screen that runs at 60Hz. So it appears the increased 144Hz refresh rate is the culprit.

I think the cam only supports 24 or 30 fps, so I suspect some sort of timing discrepancy as the root cause.

Discoverability issue on tweet sharing

When I saw you present this, you showed that you could tweet a finished product, save it to a gallery, and link other people to it from hydra-editor. However, I forgot the keyboard command and can't find it documented anywhere.

Hydra Atom plugin can't be installed: npm error

I am getting this error while trying to install the Hydra package. I am running:
OSX: High Sierra
Atom: 1.38.2
npm: 6.9.0

See attached log file.

Installing “[email protected]” failed.

npm WARN deprecated [email protected]: npm is holding this package for security reasons. As it's a core Node module, we will not transfer it over to other users. You may safely remove the package from your dependencies.
npm ERR! Unexpected end of JSON input while parsing near '...rqWWxyrVrhx6zq2Ze4v1i'

npm ERR! A complete log of this run can be found in:
npm ERR!     /Users/performance/.atom/.apm/_logs/2019-07-16T15_34_25_414Z-debug.log

2019-07-16T15_34_25_414Z-debug.log

More blending modes

After seeing the blending modes @samarthgulati implemented in hydra-blockly, I've realized they would be really nice to have and to experiment with. Also, I think a lot of people coming from image compositing (more specifically, software like GIMP, Photoshop, After Effects, etc.) would find it delightful to see more of what they know when learning hydra.
