fluxml / fluxml.github.io
Flux Website
Home Page: https://fluxml.ai
License: MIT License
Hey there,
the links to the source code of the ML experiments on the website seem to be outdated.
Example: https://github.com/FluxML/model-zoo/blob/master/mnist/conv.jl
Is your feature request related to a problem? Please describe.
I noticed Flux.jl lacks beginner tutorials, not only for particular topics but also a zero-to-one ML textbook with Julia and Flux.jl.
So I created a Julia version of Dive into Deep Learning in the repository D2lJulia. For now, I have only finished chapter 2.1.
Describe the feature you'd like
Would you (the FluxML organization) like to receive this repo? Since I'm also a beginner in Flux and ML, I think this task is best done with the Flux community. I'll continue working on it and merging community pull requests.
Describe the issue
The recent posts under https://fluxml.ai/blog/ are generated by an hfun in Franklin.jl. The title is read from a page environment variable; if it is unspecified, the post gets a generic name like "Post 10". Currently, all blog posts have titles, but not all of them are rendered correctly by the hfun.
To Reproduce
Steps to reproduce the behavior (include links to pages or assets):
1. The title environment variable is set on the post's page.
2. The hfun reads the environment variable and gives a default if it is nothing: Line 15 in 660e649.
3. If you remove the fallback logic from the code, you get an error about assigning Nothing to a Vector{String},
4. confirming that pagevar(url, :title) is returning nothing when it should return the title set in Step 1.
Step 4 makes me think this is a Franklin issue @tlienart.
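For context, here is a hypothetical sketch of the guard such an hfun could apply (pagevar is Franklin.jl's API for reading another page's variables; the function name and fallback string are illustrative, not the site's actual code):

```julia
using Franklin: pagevar

# Illustrative sketch, not the site's actual hfun: fall back to a generic
# name whenever pagevar returns nothing, so a missing :title is never
# assigned into a Vector{String}.
function post_title(url::AbstractString, i::Integer)
    t = pagevar(url, :title)
    return isnothing(t) ? "Post $i" : string(t)
end
```

If pagevar is returning nothing even when the page variable is set, the bug would be upstream in Franklin rather than in this guard.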
We should link to https://twitter.com/fluxml?lang=en somewhere on the site.
Changes in #38 seem to have affected the top-level menu as well as the content. We probably just need to make a CSS selector a bit more specific.
Hi,
I have noticed this issue on several Julia-based websites and am not sure what the source of the problem is, but @DhairyaLGandhi suggested starting here.
@DhairyaLGandhi posted the latest article on Torch.jl (looks awesome!), but the first time I visit the page, the CSS is garbled. It looks like it thinks I'm on mobile or something.
When I reload the page, everything is fine and every subsequent time I visit the site, the CSS is fine. It is only that very first visit that is garbled, but that first visit is the most important visit. Especially since I do a lot of Julia evangelizing and if I send the link to someone, I'd like to know what they see isn't garbled CSS.
I can verify that the CSS is consistently garbled on the very first visit by loading the url into a fresh incognito window on Chrome.
This might seem minor, but first impressions are important, so I hope we can track down the source of this problem.
Cheers
In the Getting Started page, the command ps = params(W, b) returns
ERROR: UndefVarError: params not defined
Stacktrace:
[1] top-level scope
@ REPL[17]:1
[2] top-level scope
@ ~/.julia/packages/CUDA/GGwVa/src/initialization.jl:52
while ps = Flux.params(W, b) works correctly. Maybe fix the Getting Started page.
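A minimal sketch of the workaround, assuming Flux is installed; on the reporter's setup params is evidently not exported, so qualifying the call avoids the UndefVarError:

```julia
using Flux

W = rand(2, 5)
b = rand(2)

# The qualified call works whether or not `params` is exported
# by the installed Flux version.
ps = Flux.params(W, b)
```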
Some posts still appear to need images and other files in https://github.com/FluxML/fluxml.github.io/tree/4be8e6cb60a08f43fbc7c7ea6b36d8cb8b44fe71/assets. e.g. the logos on https://fluxml.ai/blogposts/2021-12-1-flux-numfocus/.
Hey @Keno, per here: JuliaGPU/XLA.jl#45 (comment), has there been any update on this? The Flux site says:
Flux models can be compiled to TPUs for cloud supercomputing, and run from Google Colab notebooks.
Is this still true?
The links in the blog post https://fluxml.ai/2019/03/05/dp-vs-rl.html point to old code: https://github.com/FluxML/model-zoo/blob/10bd26ca21079b1c6ee246dd1938beab55178949/games/differentiable-programming/trebuchet/DiffRL.jl
That code no longer works (the packages are no longer in the General registry), so the post should point to the more current version: https://github.com/FluxML/model-zoo/tree/master/contrib/games/differentiable-programming/trebuchet
This would be especially helpful because, as you can see, the new link is under contrib/games and hence cannot be found by just switching branches.
At the README.md of Trebuchet.jl there was a similar issue, which was fixed in FluxML/Trebuchet.jl#9
I'm working through some of the tutorials and finding some syntax issues, for example in the multilayer perceptron on lines 62-63:
function getdata(args)
...
# Batching
train_data = DataLoader(xtrain, ytrain, batchsize=args.batchsize, shuffle=true)
test_data = DataLoader(xtest, ytest, batchsize=args.batchsize)
...
end
will throw a MethodError, since DataLoader expects the first argument to be zip(xtrain, ytrain). I think there are some other minor syntax issues as well.
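As a hedged sketch of one possible fix (on current Flux, which re-exports MLUtils.DataLoader, passing the arrays as a tuple works; the zip form mentioned above is what some older releases expected):

```julia
using Flux: DataLoader

# Sketch only: args.batchsize is assumed to come from the tutorial's
# args struct, and the four arrays from the surrounding getdata code.
function getdata(args, xtrain, ytrain, xtest, ytest)
    # Wrapping the arrays in a tuple makes each batch an (x, y) pair
    train_data = DataLoader((xtrain, ytrain); batchsize=args.batchsize, shuffle=true)
    test_data  = DataLoader((xtest, ytest); batchsize=args.batchsize)
    return train_data, test_data
end
```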
I think I found a bug in your docs, though I can't find the exact document where it occurs. Here params(model) should be ps (where ps = params([W, b])), if I'm not mistaken. (Still learning, might be wrong. Though params(model) yields Params([]), which seems wrong enough.)
Also, on the same page: I think this section is the right moment to talk about VS Code as the preferred Julia IDE. I myself was quite confused about what I was supposed to use.
Nice to have: #78 (comment)
The [Dataloader documentation] points to a tutorial page, https://fluxml.ai/tutorials/2021/01/21/data-loader.html, which is a bit outdated. In particular, the old MLDatasets.jl interface is used:
x, y = MNIST.traindata(Float32)
dataloader = DataLoader((x, y))
instead of the new one:
x, y = MNIST()[:]
dataloader = DataLoader((x, y))
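Put together, a minimal sketch of what the updated tutorial snippet might look like (assuming MLDatasets.jl >= 0.7 and MLUtils.jl; the batchsize value is illustrative):

```julia
using MLDatasets: MNIST
using MLUtils: DataLoader

# New MLDatasets interface: construct the dataset object, then index it
x, y = MNIST()[:]          # replaces the old MNIST.traindata(Float32)
dataloader = DataLoader((x, y); batchsize=128, shuffle=true)
```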
The docs section provided in the navbar is redirecting to a "page not found" error. The issue might be with the link provided 🤖
Unless the CPU code is un-runnable as is, we should give a CPU only option here: https://fluxml.ai/tutorials/2020/10/18/transfer-learning.html
As part of the NumFOCUS application, there will need to be some governance model laid out. I suggest something like is specific for SciML here: https://sciml.ai/governance/
I've got a working version up at https://gist.github.com/ToucheSir/072097e4d50147cdd97dac00c6622317, but don't know if I'll have time to make the necessary site and model zoo (since the original file was removed in FluxML/model-zoo#293) changes any time soon.
I'm a little sad that the experiments are no longer linked from the main navbar, and I don't think they're linked from anywhere else. You can still navigate to /experiments if you remember the URL, but could we get a link to it back? Or is there a grander plan here?
Hey @lilianabs are you up for writing a simple/quick page on ways folks can contribute / get involved with the Flux Ecosystem? Should be a short list:
Blog Posts, tutorials, attend an ML community call, post on Stack Overflow / Discourse, and then link to Liza's overall contributing guide on the JuliaLang website
https://discourse.julialang.org/t/enabling-2fa-on-major-github-orgs/31957
What do you think about requiring 2FA for all members of the FluxML GitHub organization?
This is a placeholder issue for updating the existing tutorials and adding workflows to periodically test them.
My proposal's text:
Flux's website has plenty of good tutorials, but some of them have been outdated for some
time. These tutorials go stale with every new Julia and Flux release making them unreliable
for newcomers. Additionally, as discussed with Dhairya Gandhi, the FluxBot can be used to
automatically test these examples, but it is currently not integrated with the FluxML
ecosystem. The bot would ideally run all the model-zoo examples on every PR created on Flux.jl.
This section would aim to:
- Update the tutorials on Flux’s website.
- Add tests to (or run) these examples to ensure they do not go stale.
- Integrate FluxBot with the existing FluxML ecosystem and get it running.
I tried triggering the FluxBot (FluxML/Flux.jl#2016 (comment)), but it did not respond. I am guessing that the bot is not deployed at the moment. I did find a buildkite-related file and a PR (DhairyaLGandhi/FluxBot.jl#2) that probably aims to shift it to GitLab's CI, but I am not sure how and where it is deployed currently. I am also not sure if there is a way to run it locally for some personal test repositories.
For the tutorials, should I migrate them from Flux.params to Optimisers.jl, or would it be too soon? I will audit all the tutorials and update them as needed!
For testing, I think it would be straightforward to periodically test the model-zoo examples using Julia's doctests, since model-zoo is a Julia package. For the website, I think the doctest(source, modules; kwargs...) definition of doctest should work.
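For instance, a sketch of what such a periodic check might look like with Documenter.jl's doctest (ModelZoo is a placeholder module name, an assumption on my part; the real model-zoo package would go there):

```julia
using Documenter

# Placeholder: substitute the actual package whose docstrings hold the doctests
using ModelZoo

# Runs every doctest in ModelZoo's docstrings and throws on failure,
# so this script can be run from a scheduled CI job.
doctest(ModelZoo; manual=false)
```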
Also, #136 aims to migrate this website to Franklin.jl. Is this still in progress, or has it been abandoned? I can take this up if required (cc: @logankilpatrick @darsnack)!
I will start by adding these periodic tests to both the repositories!
Hey @darsnack, we already started the transition to Franklin: https://github.com/tlienart/fluxml-franklin. I have somewhat dropped the ball on this but can pick it back up if it's a high-priority item. Very busy with too many things these days : )
Per JuliaML/MLDatasets.jl#86 this tutorial no longer works with macOS Monterey
Should be similar to: https://www.tensorflow.org/about
Use JuliaCon talks as case studies
For browser windows below 992px in width the top navigation bar collapses into a mobile-style dropdown. When I click on it in either Chrome or Firefox (mobile or desktop) it animates open but then collapses again as soon as the animation finishes.
Happy holidays!
Describe the issue
HTTPS is supported but not enforced on http://fluxml.ai.
To Reproduce
Visit http://fluxml.ai over plain HTTP: it loads without redirecting. The site should do a 301 redirect to HTTPS and ideally send an HSTS header.
Hey all. This was brought to my attention by this SO question, and I'm not 100% convinced that I'm right about what happens here, so please bear with me.
It looks like what the example on "Getting Started" wants to do is define a linear model from R^5 to R^2, and then train it with a single example (x, y), where x is in R^5 and y is in R^2. However, when we do the line data = zip(x, y), we generate a length-2 iterator containing [(x[1], y[1]), (x[2], y[2])] and ignore the other 3 elements of x. This is then fed into the loss function, which is still capable of computing an answer because of the elementwise operations. So what we actually end up doing is training a function from R^1 to R^1, with two examples.
I can replace x = rand(5) with x = vcat(rand(2), [missing, missing, missing]) and the whole tutorial still runs without a hitch, which seems to confirm that the last three elements of x are never examined. Is this intended behavior, or was the line intended to be something like data = zip(eachcol(x), eachcol(y))?
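The pitfall is easy to demonstrate in plain Julia (no Flux required), since zip truncates to its shortest iterator:

```julia
x = rand(5)
y = rand(2)

# zip stops at the shorter iterator, silently dropping x[3:5]
data = collect(zip(x, y))
length(data)        # 2: just (x[1], y[1]) and (x[2], y[2])

# Treating the whole arrays as one (input, target) example keeps all of x
data = [(x, y)]
length(data)        # 1
```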
Hello prospective Hacktoberfest contributor! The FluxML community would welcome new tutorials to the Flux website, which can generally be found under https://fluxml.ai/tutorials.html.
You can find the source code for the tutorials here: https://github.com/FluxML/fluxml.github.io/tree/main/tutorials/_posts. They are just markdown files.
We would be open to Pull Requests which provide a tutorial topic that is not already covered by the existing tutorials. But no need to reinvent the wheel here: if you have a favorite tutorial that you want to try to re-create using Flux, we would love to help and see it!
Find out more about contributing here: https://github.com/FluxML/fluxml.github.io/blob/main/CONTRIBUTING.md and more general ways of contributing (which may not be open hacktoberfest issues but we can happily make them into issues if that helps you) here: https://github.com/FluxML/Flux.jl/blob/master/CONTRIBUTING.md
Another good starting place would be the Model Zoo: https://github.com/FluxML/model-zoo where we have a bunch of existing models but usually without tutorials built around them.
The accuracy of the conv classifier for MNIST on the website is rather poor. This is because the model was trained for only one epoch (if the code from the model-zoo was actually used).
See: FluxML/model-zoo#72.
I have weights for a model in bson format that I can make a PR for if necessary.
Looks like the Go model, or maybe the cartpole one, made some changes to the flux.js scripts that broke other models. We should probably just have a local copy of flux.js etc, so that all the models are completely independent.
cc @Roboneet
Hi, I encountered an error when calling params while going over the Getting Started example here. The lines that caused my error were from step 6:
ps = params(W, b)
which can be fixed via
ps = Flux.params(W, b)
I am unsure where the source of this doc is so that I could submit a PR for it. I ran it on Julia 1.8.0-rc4 on an Apple Silicon Mac.