justinas / nosurf
CSRF protection middleware for Go.
Home Page: http://godoc.org/github.com/justinas/nosurf
License: MIT License
The current implementation in tokengen.go returns a 44-char string with a single padding character (=) at the end. It may make more sense to set rawTokenLength to 24, which results in a 32-character output string when Base64 encoded, or alternatively to 33, which gives a 44-character string with no padding.
This is very clearly nitpicking (!) but may be worth looking at.
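For reference, the padded Base64 output lengths mentioned above can be checked directly with encoding/base64; a standalone sketch, not nosurf code:

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// encodedLen reports the Base64 (StdEncoding, padded) output length
// for a raw token of n bytes.
func encodedLen(n int) int {
	return base64.StdEncoding.EncodedLen(n)
}

func main() {
	// 24 raw bytes encode to 32 chars, 32 to 44 (with one '='),
	// and 33 to 44 with no padding (33 is divisible by 3).
	for _, n := range []int{24, 32, 33} {
		fmt.Printf("%d raw bytes -> %d encoded chars\n", n, encodedLen(n))
	}
}
```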
Chrome, Firefox and Safari do not seem to like the response for a failed token verification when using a gzip middleware. Chrome reports "This webpage is not available ERR_INVALID_RESPONSE".
What seems to cause the problem is Content-Type: application/x-gzip. In the example this happens because the nosurf failure handler does not set any content type, and the gzip middleware sets it to application/x-gzip if none is set.
I have read parts of the HTTP specs but can't really tell whether this is a valid response or not; in any case, most browsers do not like it. Would it make sense to change nosurf.defaultFailureHandler to http.Error(rw, "", http.StatusBadRequest) instead, which sets the content type to text/plain; charset=utf-8?
package main

import (
	"net/http"

	"github.com/codegangsta/negroni"
	"github.com/phyber/negroni-gzip/gzip"
)

func main() {
	n := negroni.New()
	n.Use(gzip.Gzip(gzip.DefaultCompression))
	n.UseHandlerFunc(func(rw http.ResponseWriter, r *http.Request) {
		// simulate nosurf.defaultFailureHandler
		rw.WriteHeader(http.StatusBadRequest)
	})
	http.ListenAndServe(":3001", n)
}
Or is this a bug in the gzip middleware? Should it not have a fallback content type (Content-Encoding: gzip is enough)?
Issue opened for the creation of a wiki page that summarizes the doubts and problems for newbies (#52).
While writing #52 I had a doubt: do I need to use surfing.SetBaseCookie(http.Cookie{Path: "/"})?
I don't know if having many CSRF cookies makes sense if I have these routes:
Why does nosurf keep creating cookies for these routes? I think I just need one cookie with Path: "/" and Domain: ".mysite.com".
Where am I wrong?
This may be a bit silly, since I don't think it is intended usage, but I find it kind of odd that by default nosurf doesn't actually verify that the CSRF token was generated by the server, just that it is structurally valid, so it is relatively easy to bypass.
Anyway, the example code says...
<!-- Try removing this or changing its value
and see what happens -->
OK challenge accepted:
==> GET
ignoring csrf token prnkNlDJPKrNQztxLZ41TnTV9ILVRhpbfTatqo/qbF0= -- posting with cookie MDEyMzQ1Njc4OTAxMjM0NTY3ODkwMTIzNDU2Nzg5MDE= X-CSRF-TOKEN header guBSEnDYkGLIhRVVJKnMdUfrx57ipBW2YhecPtC1ni+y0WAhRO2mVfC8JWQWmvhAcdz/p9KVJ4VWIqoJ6IyuHg== instead
==> POST
response: http status code 200
So I am able to bypass the CSRF protection by just generating my own token and removing the one from the form. Again, I realize this is possibly a silly scenario, but I was challenged to break the example code and I did. Client code follows:
package main

import (
	"bytes"
	"crypto/rand"
	"encoding/base64"
	"fmt"
	"io"
	"net/http"
	"net/url"
	"strings"
	"time"
)

const (
	tokenLength = 32
)

func main() {
	client := &http.Client{}

	selfTokenBytes := []byte("01234567890123456789012345678901")
	selfCookieToken := b64encode(selfTokenBytes)
	selfHeaderToken := b64encode(maskToken(selfTokenBytes))

	r := bytes.NewReader([]byte(""))
	fmt.Printf("==> GET\n")
	req, err := http.NewRequest("GET", "http://127.0.0.1:8000", r)
	resp, err := client.Do(req)
	if err != nil {
		fmt.Printf("err: %v\n", err)
		return
	}

	s := strings.Split(resp.Header["Set-Cookie"][0], ";")
	ss := strings.SplitN(s[0], "=", 2)
	fmt.Printf("ignoring csrf token %v -- posting with cookie %v X-CSRF-TOKEN "+
		"header %v instead\n", ss[1], selfCookieToken, selfHeaderToken)

	fmt.Printf("==> POST\n")
	form := url.Values{}
	form.Add("name", "jolan")
	req2, err := http.NewRequest("POST", "http://127.0.0.1:8000", strings.NewReader(form.Encode()))
	req2.Header.Add("X-CSRF-Token", selfHeaderToken)
	expiration := time.Now().Add(365 * 24 * time.Hour)
	cookie := http.Cookie{Name: "csrf_token", Value: selfCookieToken, Expires: expiration}
	req2.AddCookie(&cookie)
	resp2, err := client.Do(req2)
	if err != nil {
		fmt.Printf("err: %v\n", err)
		return
	}
	fmt.Printf("response: http status code %v\n", resp2.StatusCode)
}

func b64encode(data []byte) string {
	return base64.StdEncoding.EncodeToString(data)
}

// Masks/unmasks the given data *in place*
// with the given key.
// Slices must be of the same length, or oneTimePad will panic.
func oneTimePad(data, key []byte) {
	n := len(data)
	if n != len(key) {
		panic("Lengths of slices are not equal")
	}
	for i := 0; i < n; i++ {
		data[i] ^= key[i]
	}
}

func maskToken(data []byte) []byte {
	if len(data) != tokenLength {
		fmt.Printf("%v != %v\n", len(data), tokenLength)
		panic("data != tokenLength")
	}

	// tokenLength*2 == len(enckey + token)
	result := make([]byte, 2*tokenLength)
	// the first half of the result is the OTP
	// the second half is the masked token itself
	key := result[:tokenLength]
	token := result[tokenLength:]
	copy(token, data)

	// generate the random key
	if _, err := io.ReadFull(rand.Reader, key); err != nil {
		panic(err)
	}

	oneTimePad(token, key)
	return result
}
I don't understand the purpose of the encryption used in this library. Here is how it works:

Encryption:

- A: the real 32-byte token
- B: a fresh random 32-byte pad
- A ^ B -> C: the masked token
- BC: the pad and the masked token, concatenated, are what gets sent out

Decryption splits the received value into a 32-byte pad: B, and a 32-byte encrypted blob: C, then:

- C ^ B -> A

This "crypto" is at best a thin layer of obfuscation. It would be logically equivalent to just sending the original 32-byte token as a cookie. In fact, this would then be the Double Submit Cookie pattern.
If the intent is to use the Encrypted Token Pattern (described on the same page, and which I think is way more robust than Double Submit Cookies), then the algorithm would have to work like this:

- generate a token A, a user/session identifier B and a timestamp C
- f(ABC, S): encrypt the concatenation with a server-side secret S
- send only the encrypted value to the client

Then when a new request comes in, we decrypt it back, g(..., S) -> ABC, and check that:

- A agrees with what was supplied in the form or header
- B agrees with that of the request
- C is within a reasonable expiry duration

If there is no cookie, it appears that the token is generated twice.
First, in handler.go on line 85, after ErrNoCookie, h.RegenerateToken(w, r) generates a token. However, realToken remains nil.
Then on line 98 it checks len(realToken) != tokenLength, but since realToken is not set, that is 0 != 32, and thus the token is regenerated again on line 99.
The BREACH paper states
In order for the attack to be successful,several things are required. To be vulnerable to this side-channel, a web app must:
- Be served from a server that uses HTTP-level compression
- Reflect user-input in HTTP response bodies
- Reflect a secret (such as a CSRF token) in HTTP response bodies
Additionally, while not strictly a requirement, the attack is helped greatly by responses that remain mostly the same modulo the attacker’s guess.
Just so I have this correct: the mask applied to the token in crypto.go is there to make sure the HTTP cookie header and the HTML body do not contain the same string/bytes, correct?
This is to avoid BREACH / CRIME-style deconstruction of the compression to find repeated strings? Am I understanding this correctly?
If the bytes are random to begin with, why is the token XORed with the one-time pad (OTP), since it will be unique with every response anyway?
Using gvt (and presumably other tools that support automatic recursive dependency vendoring) leads to a bunch of dependencies being unnecessarily added when vendoring nosurf.
$ gvt fetch github.com/justinas/nosurf
2016/05/23 19:44:37 fetching recursive dependency github.com/hoisie/web
2016/05/23 19:44:41 fetching recursive dependency github.com/zenazn/goji
2016/05/23 19:44:46 fetching recursive dependency golang.org/x/net/websocket
It looks like this is because of the examples folder being in the main repository. Renaming the examples directory to _examples would fix this for users of gvt; I'm unsure about other vendoring tools.
My gut feeling is that it would be cleaner and safer to move these examples out of the main repository (where they're not actually needed), though. Perhaps to gists, referencing them from the readme instead? I'm happy to do that and send a PR.
handler_go17_test.go: should not use built-in type string as key for value; define your own type to avoid collisions.
As the context documentation puts it: to prevent clashes across packages using context, the provided key must be comparable and should not be of type string or any other built-in type. Users of WithValue must define their own types for keys. To avoid allocating when assigning to an interface{}, context keys often have concrete type struct{}. Alternatively, exported context key variables' static type should be a pointer or interface.
I came across the same situation here:
http://stackoverflow.com/questions/26818516/processing-multiple-forms-from-an-only-template
And I'm wondering whether using a single token for several POST requests would work, and whether it is secure enough.
First, thank you for fixing enctype="multipart/form-data".
However, now processing a form with reader, err := r.MultipartReader() results in the following error: http: multipart handled by ParseMultipartForm.
Using MultipartReader() is advantageous for my use case because it allows me to process the request body (most importantly the file uploads) as a stream.
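The conflict is reproducible with the standard library alone: once a middleware consumes the body via ParseMultipartForm (as it might do to read the token from the form), a later MultipartReader call fails. A standalone sketch (the field name is made up):

```go
package main

import (
	"bytes"
	"fmt"
	"mime/multipart"
	"net/http"
)

// buildMultipartRequest makes a minimal multipart/form-data POST request.
func buildMultipartRequest() *http.Request {
	var body bytes.Buffer
	w := multipart.NewWriter(&body)
	w.WriteField("csrf_token", "dummy")
	w.Close()
	req, _ := http.NewRequest("POST", "http://example.com/", &body)
	req.Header.Set("Content-Type", w.FormDataContentType())
	return req
}

func main() {
	req := buildMultipartRequest()
	// Consume the body the way a middleware might.
	req.ParseMultipartForm(1 << 20)
	// A streaming reader is no longer available.
	_, err := req.MultipartReader()
	fmt.Println(err)
}
```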
Is there an easy way to handle the scenario of multiple cookies in context, and not just the default csrf_token?
It's an iframe integration scenario where the same frame will be included multiple times in the page; the way the library is now, each frame will overwrite csrf_token, meaning that if the 1st form submits, it will have a different token than the latest one in the context.
For setting the tokens with different names I've managed to simply append the frameId inside the HandlerFunc, but the problem is in the verification step, where the context is the same.
The gorilla-based context used in nosurf needs mutex locking and garbage collection. golang.org/x/net/context is a better context implementation, created by the Go team, that creates zero garbage. It'd be good to see nosurf migrate to x/net/context.
Token can be stored in the context like this:
type csrfTokenKey struct{}

// ...

// in the handler
c = context.WithValue(c, csrfTokenKey{}, token)

And retrieved by:

func ContextCSRFToken(c context.Context) (string, bool) {
	v, ok := c.Value(csrfTokenKey{}).(string)
	return v, ok
}
We should add the installation command to the README.md, otherwise a new user might get confused! The command to use:
go get github.com/justinas/nosurf
I may be missing something, but it appears (in context.go) that the csrf context is designed to use an in-memory map, with no option to use something like memcache or redis.
I think this would prevent nosurf from being used in an environment where multiple app instances run behind a load balancer, unless something like sticky sessions is employed.
It would be nice if the in-memory map were abstracted out to an interface, so that anyone could plug in alternative stores. Gorilla sessions does this, and it seems to work pretty well.
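A sketch of what such an abstraction could look like; the TokenStore name and its methods are hypothetical, not nosurf's API. A Redis or memcache backend would implement the same interface:

```go
package main

import (
	"fmt"
	"sync"
)

// TokenStore is a hypothetical abstraction over the in-memory map,
// so alternative backends (Redis, memcache) could be plugged in.
type TokenStore interface {
	Get(sessionID string) (token string, ok bool)
	Set(sessionID, token string)
	Delete(sessionID string)
}

// memStore is the default, process-local implementation.
type memStore struct {
	mu sync.RWMutex
	m  map[string]string
}

func newMemStore() *memStore { return &memStore{m: make(map[string]string)} }

func (s *memStore) Get(id string) (string, bool) {
	s.mu.RLock()
	defer s.mu.RUnlock()
	t, ok := s.m[id]
	return t, ok
}

func (s *memStore) Set(id, token string) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.m[id] = token
}

func (s *memStore) Delete(id string) {
	s.mu.Lock()
	defer s.mu.Unlock()
	delete(s.m, id)
}

func main() {
	var store TokenStore = newMemStore() // swap in another backend here
	store.Set("sess-1", "tok-abc")
	t, ok := store.Get("sess-1")
	fmt.Println(t, ok)
}
```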
nosurf does not currently sign cookies, as the standard http.Cookie implementation only defines the "basic" attributes of a cookie.
CSRF cookies should be signed with HMAC-SHA256 (so we can identify attempts to tamper) and then authenticated before checking the cookie against the submitted request. An example of a solid authentication implementation can be found here. Two possible APIs:
- func (h *CSRFHandler) SetAuthKeys(key []byte, keys ...[]byte) and func (h *CSRFHandler) SetEncryptionKeys(key []byte, keys ...[]byte), with the variadic param allowing a package user to pass in multiple key pairs (which facilitates cycling keys). This would be similar to how gorilla/securecookie handles key rotation, but you could probably get away with just accepting a single key.
- Have nosurf.New accept an options struct that then calls the securecookie API before returning a configured *CSRFHandler.
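A minimal sketch of the signing step itself, using only the standard library; the cookie format and helper names are made up for illustration, not the proposed nosurf API:

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/base64"
	"fmt"
	"strings"
)

// signCookie appends an HMAC-SHA256 tag to a cookie value so tampering
// can be detected before the token is compared against the request.
func signCookie(value string, key []byte) string {
	mac := hmac.New(sha256.New, key)
	mac.Write([]byte(value))
	tag := base64.StdEncoding.EncodeToString(mac.Sum(nil))
	return value + "|" + tag
}

// verifyCookie authenticates a signed value, returning the original
// value and whether the tag was valid.
func verifyCookie(signed string, key []byte) (string, bool) {
	i := strings.LastIndex(signed, "|")
	if i < 0 {
		return "", false
	}
	value, tag := signed[:i], signed[i+1:]
	want, err := base64.StdEncoding.DecodeString(tag)
	if err != nil {
		return "", false
	}
	mac := hmac.New(sha256.New, key)
	mac.Write([]byte(value))
	if !hmac.Equal(mac.Sum(nil), want) { // constant-time compare
		return "", false
	}
	return value, true
}

func main() {
	key := []byte("server-secret-key")
	signed := signCookie("csrf-token-value", key)
	v, ok := verifyCookie(signed, key)
	fmt.Println(v, ok)
	_, ok = verifyCookie(signed+"x", key) // tampered tag fails
	fmt.Println(ok)
}
```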
Hi,
Thank you for this library, this works perfectly against CSRF attacks.
However, is there a possibility to use this to combat "Ctrl+R" (browser resubmits)?
I am following Trevor Sawler's Golang course, and hit a roadblock with section 8.2. I have detailed the issue here. Could someone please help me out? I have followed the instructions carefully, but it doesn't work as seen in the video. Many thanks!
My co-worker and I were attempting to use nosurf for something. We encountered some issues, and once we started looking at them we realized glide had picked up the 0.1 tag from 4 years ago rather than tip.
Not a big deal, since we should have looked more closely when we added the dep via glide, but please consider tagging stable releases more frequently or removing the existing tag.
I have some handlers that use POST, PUT etc that I do NOT want to be protected by nosurf. These are REST API endpoints that don't need CSRF protection as you can't use them without an Auth-Token header anyway.
I can use nosurf like this:
n := negroni.Classic()
handler := nosurf.New(mux)
handler.ExemptPath("/v1")
handler.ExemptPath("/v1/bulk")
handler.ExemptPath("/v1/flush")
n.UseHandler(handler)
n.Run(fmt.Sprintf(":%d", port))
But now my poor REST API endpoints get a Vary: Cookie header and a Set-Cookie too.
Can I use nosurf to protect just individual handler functions and not the whole mux?
The CSRF token as it is now might be acquired by an attacker using the BREACH technique (assuming the server has compression turned on).
breach-mitigation-rails and django-debreach both take up an interesting approach with this, encrypting the CSRF token with a new random string on each request. It seems like this could be easily applied to nosurf.
The key needs to be completely secret; in nosurf's implementation the key is raw text prepended to the data.
Nosurf excludes safe methods (like GET) and exempt paths (using ExemptPaths) when there is no need to check the CSRF token. That's good, but...
In handler.go you have the function ServeHTTP, and on every request, even one you are not interested in checking a token for, you still do:
.. and all of this is useless, because then you do:
if sContains(safeMethods, r.Method) || h.IsExempt(r) {
	// short-circuit with a success for safe methods
	h.handleSuccess(w, r)
	return
}
I suggest moving this check as close to the top of the ServeHTTP function as possible, so nosurf can avoid doing useless operations. Performance would improve.
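A standalone sketch of the proposed ordering; the token work is stubbed out, and safeMethods/sContains only mirror names from nosurf's source:

```go
package main

import "fmt"

var safeMethods = []string{"GET", "HEAD", "OPTIONS", "TRACE"}

func sContains(ss []string, s string) bool {
	for _, v := range ss {
		if v == s {
			return true
		}
	}
	return false
}

// serve sketches the suggested ordering: short-circuit safe methods
// first, and only run the token machinery (stubbed here) for unsafe ones.
func serve(method string, tokenWork func()) string {
	if sContains(safeMethods, method) {
		return "success (skipped token work)"
	}
	tokenWork()
	return "token checked"
}

func main() {
	calls := 0
	work := func() { calls++ }
	fmt.Println(serve("GET", work), calls)  // safe: work never runs
	fmt.Println(serve("POST", work), calls) // unsafe: work runs once
}
```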
Hi,
Thanks for this great package.
To test this I did the following, I'm using JSON on the responses and requests:
I have 2 handlers, both on root "/", one is a GET and the other a POST.
On the GET I only return a token.
On the POST I verify the token and send a new one, like:
...
// Get token from JSON into jd.
tkn := nosurf.Token(r)
if !nosurf.VerifyToken(tkn, jd.Token) {
	w.WriteHeader(http.StatusBadRequest)
	w.Write([]byte(`{"message":"Different tokens"}`))
	return
}
// Send the new token.
w.Write([]byte(fmt.Sprintf(`{"token":"%s"}`, tkn)))
What I would like to know is whether the following is normal behavior: after getting the token from the GET, I can keep making POST requests with that first token, and it always validates OK. I can even do e.g. 10 POST requests with the first token, then do another POST with the newest token sent by POST, then go back to using the first token, and it still validates.
Thanks for your help.
I am trying to exempt a few routes from csrf but noticed a "csrf_token" cookie still gets generated on those routes. Doesn't seem necessary to have that cookie on exempted routes. Also, is that cookie necessary after a form has been successfully transmitted?
An example with only 1 route that is supposed to be exempted from csrf tokens:
package main

import (
	"log"
	"net/http"

	"github.com/gorilla/mux"
	"github.com/justinas/nosurf"
)

type Routes []Route

type Route struct {
	Method      string
	Pattern     string
	HandlerFunc http.HandlerFunc
}

func mainHandler(w http.ResponseWriter, r *http.Request) {
}

func main() {
	var routes = Routes{
		Route{"GET", "/mypath", mainHandler},
	}

	router := mux.NewRouter().StrictSlash(true)
	for _, route := range routes {
		handler := route.HandlerFunc
		router.Methods(route.Method).Path(route.Pattern).Handler(handler)
	}

	// csrf protection
	csrfHandler := nosurf.New(router)
	csrfHandler.ExemptPath("/mypath")

	port := ":8080"
	log.Println("Listening at", port)
	log.Fatal(http.ListenAndServe(port, csrfHandler))
}
I'm using something like this:

handler := nosurf.New(cleanHandler)
handler.ExemptRegexps("/css(.*)", "/js(.*)", "/images(.*)")

to exempt my assets, but it doesn't seem to work:
Request URL:http://192.168.237.131/js/bootstrap.min.js
Request Method:GET
Status Code:200 OK
Request Headersview source
Accept:*/*
Accept-Encoding:gzip,deflate,sdch
Accept-Language:es,en;q=0.8,en-CA;q=0.6
Connection:keep-alive
Cookie:csrf_token=WK6UlEqLP3ioDLsUhuQTc1ZZ08DujAS5Gbxv0G2Riow=; _ga=GA1.1.1561828371.1415760157; session=MTQxOTg4Mzk2N3xfVmc3amc5OFh4RW04VUVjekhxLS16SEIwcEpyY0RUZW9EU3lodHdPSk4zUzdnTUpfYlFpR3l0dmM0a182Y0NTNVRMWE5TQ25fNWZhdzAwOHR5MjROYm5vNGoxdDRPNlA1V0FFdU5sZmQ5cm1HWVZidHk4bUg3aDBzVDBwQUhXSFNQb1JlRjdGTndCbms2UTJCN0liM0ZMR0dyRjMyYUlKSWxUVjU3NlhZVWUzaDNsMlZGczJrcnlsd0V5ZVM5SG9pc3RRVjdINk9RRy1PY245aGlkZTdRSnJncWJZelBLT196cHIwSUM0OVVUQThsNXB6NHVOS2g0PXzY2_Q54s1zOKcqBe5NimAmarqUBGrgq6LsWp1kQ28QZg==; flash=MTQxOTg4Mzk3NXxEdi1EQkFFQ180UUFBUkFCRUFBQUJQLUVBQUE9fKrNyW2LmqkQYwTkI9cMXz3dRF2VVQQx2C0LNCx5_UNC
Host:192.168.237.131
Referer:http://192.168.237.131/login
User-Agent:Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.63 Safari/537.36
Response Headersview source
Accept-Ranges:bytes
Cache-Control:public, max-age=300
Content-Encoding:gzip
Content-Type:application/x-javascript
Date:Mon, 29 Dec 2014 20:30:43 GMT
Last-Modified:Thu, 29 Aug 2013 13:52:00 GMT
Transfer-Encoding:chunked
Vary:Cookie
Vary:Accept-Encoding
Would it be possible to change the defaultFailureHandler so it includes a basic response body? Something like:

func defaultFailureHandler(w http.ResponseWriter, r *http.Request) {
	http.Error(w, http.StatusText(FailureCode), FailureCode)
}

Instead of the current http.Error(w, "", FailureCode).
Using http.StatusText() wouldn't leak any extra useful information to an attacker, but it would be a nicer and less confusing user (and developer) experience than just seeing a blank page on failure.
I'm very happy to send a PR for this, if you like.
I cannot get nosurf to work with enctype="multipart/form-data".
The failure error I get is this:
The CSRF token in the cookie doesn't match the one received in a form/header.
Here is the simple.go example modified to use enctype="multipart/form-data"
https://gist.github.com/bryanjeal/448cc89b30643f610315
I'm a newbie in everything.
I would like to write a Wiki page for newbies like me. Would you mind, @justinas?
Title: "How to use nosurf with external Single Page Application (SPA) like Ember, React, Angular or jQuery Ajax".
My app is both SPA and server-rendered: authentication (using authboss - https://github.com/volatiletech/authboss) plus a JavaScript part, so I need both the "JSON API" endpoints CSRF-protected and the CSRF form values for authboss and other server-rendered pages.
I'm using the chi router (https://github.com/go-chi/chi) like this:
package main

import (
	"log"
	"net/http"

	"github.com/go-chi/chi"
	"github.com/justinas/nosurf"
)

func main() {
	r := chi.NewRouter()
	r.Use(nosurfing, addCookie)
	r.Get("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("welcome"))
	})
	http.ListenAndServe(":3000", r)
}

func nosurfing(h http.Handler) http.Handler {
	surfing := nosurf.New(h)
	// Using Path: "/" just because I don't know whether it's right to create
	// a cookie for every "sub-path" like "/auth" or "/api"; I opened an issue
	// to clarify this: https://github.com/justinas/nosurf/issues/53
	surfing.SetBaseCookie(http.Cookie{Path: "/"})
	surfing.SetFailureHandler(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		log.Println("Failed to validate CSRF token:", nosurf.Reason(r))
		w.WriteHeader(http.StatusBadRequest)
	}))
	return surfing
}

func addCookie(handler http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Again, Path: "/" because I don't know whether a cookie per sub-path
		// is right; see https://github.com/justinas/nosurf/issues/53
		cookie := &http.Cookie{Name: "nosurf_cookie_for_ajax", Value: nosurf.Token(r), Path: "/"}
		http.SetCookie(w, cookie)
		handler.ServeHTTP(w, r)
	})
}
Now every time I visit a route, it creates a cookie named csrf_token and one named nosurf_cookie_for_ajax.
The first one is masked and should be set HttpOnly; the second one is different every time and is meant to be read from JavaScript, which has to send it on every POST (or otherwise CSRF-protected) call in a request header named X-CSRF-Token.
Am I right?
The second cookie could perhaps be created only if a user is logged in (if you need this).
I think it's good to write down here also the default security settings for cookies:

type Cookie struct {
	Name     string   // For cookie 1 I would use the default value. For 2 I call it "X-CSRF-Token", matching the request header name.
	Value    string   // For cookie 1 the default; for 2 I use nosurf.Token(r)
	Path     string   // 1: "/", 2: "/", but I need to understand this behaviour better; I opened an issue about it: https://github.com/justinas/nosurf/issues/53
	Domain   string   // default, like Path I think
	MaxAge   int      // here I have some doubts; I think I will leave nosurf's default for both
	Secure   bool     // true (be careful if you are @ localhost)
	HttpOnly bool     // true for 1 and false for 2
	SameSite SameSite // up to you, study it; it helps with CSRF problems
}
/csrf for tokens

It seems like the tokens sent out in cookies are never masked. They are masked before being stored in the context, but then the unmasked token is sent out in the cookie. That seems incorrect to me, but I'm not a crypto expert.
This is based trying to use this library, and based on reading: https://github.com/justinas/nosurf/blob/master/handler.go#L182
When compiled with Go 1.7, nosurf.Token(r) returns an empty string.
I've tried -gcflags=-ssa=0 as suggested by the Go 1.7 release notes, but it didn't help.
https://golang.org/doc/go1.7
math/rand is cryptographically insecure and thus isn't suitable for token generation. We should switch to only using crypto/rand.
This should be an easy, non-breaking change.
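A minimal sketch of crypto/rand-based generation, assuming a 32-byte raw token as elsewhere in this thread:

```go
package main

import (
	"crypto/rand"
	"encoding/base64"
	"fmt"
)

// generateToken draws n bytes from the OS CSPRNG; unlike math/rand,
// crypto/rand output is unpredictable to an attacker.
func generateToken(n int) (string, error) {
	b := make([]byte, n)
	if _, err := rand.Read(b); err != nil {
		return "", err // the system entropy source failed
	}
	return base64.StdEncoding.EncodeToString(b), nil
}

func main() {
	t, err := generateToken(32)
	if err != nil {
		panic(err)
	}
	fmt.Println(len(t)) // 32 raw bytes -> 44 Base64 chars
}
```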
Are there plans to have https://github.com/julienschmidt/httprouter compatibility?
Thanks in advance.
I am relatively new to back-end programming and Go, but I thought it'd be a good idea to post this. I am able to "authenticate" requests against the example code by passing the csrf_token cookie and form data to cURL like this:
curl http://localhost:8000 -d "name=abcd&csrf_token=dEdKyAmXFbvNZGcWvVcBQVAb8IlVwS10SAFqwSQ/k7IkMvQbmRzMHV4M5V197UPycAEOncxxler1It9TtHbpiA==" --cookie "csrf_token=UHW+05CL2aaTaIJLwLpCsyAa/hSZsLievSO1kpBJejo="
and then the response is
<!doctype html>
<html>
<body>
<p>Your name: abcd</p>
<form action="/" method="POST">
<input type="text" name="name">
<input type="hidden" name="csrf_token" value="U4/bPZUAKZ+wezr8YcWkdEpsJ+2gnLt6UPPXzIhXAKYD+mXuBYvwOSMTuLehf+bHanbZ+TksA+Tt0GJeGB56nA==">
<input type="submit" value="Send">
</form>
</body>
</html>
I'm not completely sure if this is an expected behavior or a flaw.
How would I log when somebody does send a request with a bad token?
Background: the current system of open reads of the CSRF cookie by JavaScript also means that valid CSRF tokens can be stolen and re-used infinitely(?), assuming 1) the route does not require auth/session data or 2) a valid session was also stolen.
Thought experiment: I'm contemplating changes to combine the two separate cookies (session and CSRF) into a single cookie. This would reduce the load on the randomness source and reduce bandwidth between the server and client.
Currently, however, the CSRF cookie is JavaScript-readable, which is an attack vector for the session. To prevent malicious client-side JavaScript from stealing the OTP CSRF token (and thereby knowing how to generate the true token to compare with it), two changes seem required:
- the cookie should be httpOnly, Strict, and Secure (HTTPS-only)
- OTP+OTP(Token) should be hashed with a server secret: OTP+OTP(hash(secret, token))
This means that if JavaScript were to retrieve/steal the form/header token, it would be unable to reconstruct the original HTTP-only cookie token for the session itself. (This is important if the CSRF cookie was also used as the session cookie.)
When calling RegenerateToken() in a request context where the client is not sending a CSRF cookie, two CSRF cookies will be generated:
map[Set-Cookie:[csrf_token=aZA5CKCpmzGwlyfyFZp1akOOo4dSbZEdSAziaN+nRYE=; Path=/; Domain=example.com; Max-Age=31536000; HttpOnly; Secure csrf_token=xe/JUh5YavyzQtmIqU018swoHmPN5nQsTSqSJscKJU4=; Path=/; Domain=example.com; Max-Age=31536000; HttpOnly; Secure] Vary:[Cookie]]
Depending on the order in which the browser stores the cookies, this can lead to false-positive CSRF detection.
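The underlying mechanics are easy to reproduce with the standard library: http.SetCookie appends rather than replaces, so two calls for the same cookie name yield two Set-Cookie headers. A standalone sketch:

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
)

// setTwice sets the same cookie name twice and reports how many
// Set-Cookie headers end up on the response.
func setTwice() int {
	rec := httptest.NewRecorder()
	http.SetCookie(rec, &http.Cookie{Name: "csrf_token", Value: "first", Path: "/"})
	http.SetCookie(rec, &http.Cookie{Name: "csrf_token", Value: "second", Path: "/"})
	return len(rec.Header()["Set-Cookie"])
}

func main() {
	// Two headers reach the client; which value wins is up to the browser.
	fmt.Println(setTwice())
}
```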
------ context_legacy.go --------- (good)

func Token(req *http.Request) string {
	cmMutex.RLock()
	defer cmMutex.RUnlock()
	ctx, ok := contextMap[req]
	if !ok {
		return ""
	}
	return ctx.token
}

---------- context.go --------------------- (error)

// Token takes an HTTP request and returns
// the CSRF token for that request
// or an empty string if the token does not exist. <----------- ERROR
func Token(req *http.Request) string {
	ctx := req.Context().Value(nosurfKey).(*csrfContext)
	return ctx.token
}

"...an empty string if the token does not exist" --> a panic occurs instead!
PANIC: interface conversion: interface {} is nil, not *nosurf.csrfContext
Sorry for my poor English.
Hi,
My code:
func (mr *Router) Handle(method string, path string, handler http.Handler) {
	mr.router.Handler(method, path,
		csrfHandler(
			logHandler(logger.Debug)(
				securityHeaderHandler()(
					handler,
				),
			),
		),
	)
}

func csrfHandler(next http.Handler) http.Handler {
	csrfHandler := nosurf.New(next)
	csrfHandler.SetBaseCookie(http.Cookie{
		Name:     csrfCookieName, // "csrf"
		Path:     "/",
		Domain:   "",
		Secure:   true,
		HttpOnly: true,
		MaxAge:   int(sessionTimeoutSec.Seconds()),
		SameSite: http.SameSiteStrictMode,
	})
	return csrfHandler
}
However, every request without a cookie returns:
$ curl -v http://localhost:8080/login 1>/dev/null
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 8080 (#0)
> GET /login HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/7.54.0
> Accept: */*
>
< HTTP/1.1 200 OK
< access-control-allow-methods: GET,POST,HEAD,OPTIONS
< x-content-type-options: nosniff
< content-security-policy: default-src 'none'; script-src 'self'; style-src 'self'; img-src 'self' data:; font-src 'self' fonts.gstatic.com
< set-cookie: csrf_token=5BeFIyTJI/fipPPgEcdPcw4t4vTfOZDjYoOCt8/iRSI=; Path=/; Max-Age=900; HttpOnly; Secure; SameSite=Strict
Mostly curious. Wondering what level of security this extra obfuscation brings.
An HTTP Referer [sic] check was added in commit b1b164f for HTTPS sites. Its motivation is:
+ // if the request is secure, we enforce origin check
+ // for referer to prevent MITM of http->https requests
I doubt that this check will add much value. The token is not secret, it just needs to be unpredictable. Since httpOnly is not set by default, the cookie can also be leaked through XSS.
An open redirect vulnerability would also bypass this check.
There is at least one case where this breaks: when (Firefox) users disable the Referer header. What about using the Secure cookie flag instead? That way, a different token would be sent to the http and https origins, and even a passive adversary could not see the token of the secure origin.
Related comment: #11 (comment)
Edit: even with the Secure flag set, it would not help with existing cookies (which have a default lifetime of one year). Any cookies injected into an HTTP origin will also be sent to an HTTPS one.
A possible alternative to the referrer check here is the SameSite attribute, although it has the same problem as above (HTTP cookies can still be injected into HTTPS).
Assuming a passive adversary, a possible mitigation is including the scheme in the cookie name or value. When the scheme does not match, treat the cookie as invalid and send a new one (with the Secure flag set).
I'm only able to get a 400 Bad Request with POST/DELETE requests to my REST application.
Running an app in localhost I have this value in my csrf_token cookie:
0bYcWmFvvMpZXMSgau2Jx3uxQGhyfEtTxOEC6zrtlfs=
And the value passed back in X-CSRF-Token:
0bYcWmFvvMpZXMSgau2Jx3uxQGhyfEtTxOEC6zrtlfs=
My set up:
http.Handle("/", nosurf.New(myRoutes))
http.ListenAndServe(":"+port, nil)
I had this working fine until a recent go get update, so some kind of regression maybe?
The middleware should set a Vary: Cookie header on CSRF-protected pages to force proxies to never cache the page. This should, in most cases, override any Cache-Control or ETag headers set otherwise down the line.
The Django docs cover this nicely, as do the Varnish docs.
Note that in many cases good proxies won't cache the page anyway, but it's worth being sure.
PS: I can send a PR for this, but since it's effectively a one-liner prior to passing the handler, I figured it's probably easier for you to include in your next update/change.