fog / fog-backblaze
Integration library for gem fog and Backblaze B2 Cloud Storage
License: MIT License
Does this gem support files larger than 5GB? (i.e. large_files, https://www.backblaze.com/b2/docs/large_files.html)
I've tried uploading files at the 5GB size and the Backblaze server disconnects the connection.
Looking at the source code, I saw that this gem reads the complete file before uploading, and I'm worried about how much memory this will consume for large files.
fog-backblaze/lib/fog/backblaze/storage/real.rb
Lines 183 to 185 in f45d376
Is there any other way to improve this, such as using a streamed/chunked upload?
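For reference, the B2 large-file API (b2_start_large_file / b2_upload_part / b2_finish_large_file) accepts the file in parts, so the gem could read fixed-size chunks instead of slurping the whole file. A minimal pure-Ruby sketch of the chunked read loop, assuming the actual b2_upload_part call would consume each yielded chunk:

```ruby
require 'digest'

# Read a file in fixed-size chunks instead of loading it all at once.
# B2 requires parts of at least 5 MB (except the last part). The block
# receiving each chunk stands in for the real b2_upload_part request.
CHUNK_SIZE = 5 * 1024 * 1024

def each_part(path, chunk_size = CHUNK_SIZE)
  File.open(path, 'rb') do |io|
    part_number = 1
    while (chunk = io.read(chunk_size))
      # Each part needs its own SHA1 for the X-Bz-Content-Sha1 header.
      yield part_number, chunk, Digest::SHA1.hexdigest(chunk)
      part_number += 1
    end
  end
end
```

With this loop, peak memory stays at one chunk rather than the whole file.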
CarrierWave.configure do |config|
config.fog_credentials = {
provider: 'backblaze',
b2_account_id: ENV['BACKBLAZE_ACCOUNT_ID'],
b2_account_token: ENV['BACKBLAZE_ACCOUNT_TOKEN']
}
config.fog_public = false
end
# Public
fog_public = true
object.file.url
# https://f001.backblazeb2.com/file/bucket/uploads/file.jpg
# Private
With fog_public = false
object.file.url
# /uploads/file.jpg
Backblaze's B2 API supports an optional HTTP request header, X-Bz-Test-Mode. By setting this header to certain specific values, a developer can trigger intermittent failures at different stages in a session, enabling deeper testing of their error handling.
https://www.backblaze.com/b2/docs/integration_checklist.html
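As an illustration of what supporting this could look like, here is a plain Net::HTTP sketch that attaches the header to an upload request; fail_some_uploads is one of the test-mode values described in the B2 docs, and the URL/token below are placeholders:

```ruby
require 'net/http'

# Build an upload request with the B2 test-mode header attached.
# 'fail_some_uploads' asks B2 to return intermittent failures so that
# client retry logic can be exercised.
def test_mode_request(upload_url, auth_token, body)
  uri = URI(upload_url)
  req = Net::HTTP::Post.new(uri)
  req['Authorization'] = auth_token
  req['X-Bz-Test-Mode'] = 'fail_some_uploads'
  req.body = body
  req
end
```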
While debugging why Backblaze wasn't honouring the content_disposition I was passing to put_object, I noticed that its Hash#merge call takes a Hash indexed by symbols (the {'foo': 'bar'} syntax) and merges in a hash indexed by strings ({'foo' => 'bar'}). As a result, a user-defined Content-Type doesn't override the default b2/x-auto.
This behaviour can be seen in the logger.debug output from the request body, where there is both a :"Content-Type" and a "Content-Type" key; the response contentType of application/octet-stream then indicates that b2/x-auto had been used.
You will also notice both :Authorization and "Authorization" headers, which result from the same mismatch between the symbol :Authorization used in put_object and the string Hash index used in b2_command.
(Formatting is my own to make the lines easier to read)
# Request body
{:body=>"-- Body 597185 bytes --",
:headers=>{:Authorization=>"<redacted>",
:"Content-Type"=>"b2/x-auto",
:"X-Bz-File-Name"=>"<redacted>",
:"X-Bz-Content-Sha1"=>"857b96203ddfa5c50052c003a3db08d20cb2fc35",
"Content-Type"=>"text/plain",
"X-Bz-Info-src_last_modified_millis"=>1563222099031,
"Authorization"=>"<redacted>"}}
# API response
{"accountId"=>"b8597a332ec1",
"action"=>"upload",
"bucketId"=>"<redacted>",
"contentLength"=>597185,
"contentSha1"=>"857b96203ddfa5c50052c003a3db08d20cb2fc35",
"contentType"=>"application/octet-stream",
"fileId"=>"<redacted>",
"fileInfo"=>{"src_last_modified_millis"=>"1563222099031"},
"fileName"=>"<redacted>",
"uploadTimestamp"=>1563236520000}
fog-backblaze/lib/fog/backblaze/storage/real.rb
Lines 214 to 219 in f45d376
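The mismatch is easy to reproduce in plain Ruby, since a symbol key and a string key are distinct Hash keys and merge therefore adds rather than overrides:

```ruby
# {'foo': ...} creates a symbol key; {'foo' => ...} creates a string key.
# Merging one into the other keeps both entries instead of overriding.
defaults = { 'Content-Type': 'b2/x-auto' }                  # key is :"Content-Type"
headers  = defaults.merge('Content-Type' => 'text/plain')   # key is "Content-Type"

headers.size              # => 2, both keys survive
headers[:'Content-Type']  # => "b2/x-auto"
headers['Content-Type']   # => "text/plain"
```

The fix would be to index the defaults and the user-supplied headers with the same key type before merging.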
When using CarrierWave, I get this error when uploading. The file uploads successfully, but it looks like CarrierWave is trying to cache the uploaded file on disk after the upload.
undefined method `copy_object' for #<Fog::Backblaze::Storage::Real>
carrierwave (2.1.0) lib/carrierwave/storage/fog.rb:453:in `copy_to'
carrierwave (2.1.0) lib/carrierwave/storage/fog.rb:340:in `store'
carrierwave (2.1.0) lib/carrierwave/storage/fog.rb:86:in `store!'
carrierwave (2.1.0) lib/carrierwave/uploader/store.rb:66:in `block in store!'
carrierwave (2.1.0) lib/carrierwave/uploader/callbacks.rb:15:in `with_callbacks'
carrierwave (2.1.0) lib/carrierwave/uploader/store.rb:65:in `store!'
carrierwave (2.1.0) lib/carrierwave/mounter.rb:105:in `each'
carrierwave (2.1.0) lib/carrierwave/mounter.rb:105:in `store!'
carrierwave (2.1.0) lib/carrierwave/mount.rb:401:in `store_document!'
...
When I monkey patch Fog::Backblaze::Storage::Real
like so:
def copy_object(*args)
p args
end
The output is:
["my-bucket", "uploads/tmp/filename", "my-bucket", "uploads/filename", {}]
"uploads/tmp/filename" looks to be the CarrierWave cache path and "uploads/filename" looks to be the final path in the bucket.
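Until the gem implements copy_object natively (B2 does have a server-side b2_copy_file endpoint), a download-then-reupload fallback could be patched in. This is only a sketch: it assumes the connection responds to get_object and put_object with the argument order seen in the trace above, which may not match the gem's real signatures.

```ruby
# Naive copy_object fallback: download the source object and re-upload
# it under the target key. A proper fix would call B2's b2_copy_file
# endpoint, which copies server-side without transferring the body.
module CopyObjectFallback
  def copy_object(source_bucket, source_object, target_bucket, target_object, options = {})
    body = get_object(source_bucket, source_object).body
    put_object(target_bucket, target_object, body, options)
  end
end

# Hypothetical usage, untested against the real gem:
# Fog::Backblaze::Storage::Real.prepend(CopyObjectFallback)
```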
I am building a Rails 4 Application with Carrierwave and Backblaze B2 via Fog.
Locally, with the CarrierWave storage set to :file it works, but apparently not in production using fog storage. Can you help me with this?
class MicropostsController < ApplicationController
before_action :logged_in_user, only: [:create, :destroy]
before_action :correct_user, only: :destroy
def create
@micropost = current_user.microposts.build(micropost_params)
if @micropost.save
flash[:success] = "Micropost created!"
redirect_to root_url
else
@feed_items = []
render 'static_pages/home'
end
end
best regards,
ben
It seems that there's a newish deprecation warning with the latest Fog.
[fog][DEPRECATION] Unable to load Fog::Backblaze::Storage
[fog][DEPRECATION] The format Fog::Storage::Backblaze is deprecated
It appears the gem has not been published yet. When do you plan to do that?
Could not find a valid gem 'fog-backblaze' (>= 0) in any repository
Thanks!
The first call to b2_account_id, which is triggered by almost everything, causes the JSON response from b2_authorize_account to be printed to the console. This behaviour seems to be debugging code that was accidentally left in.
I seem to be getting the following error when submitting my form for my model:
no implicit conversion of ActionDispatch::Http::UploadedFile into String
Some details of my setup are below:
Rails 6.0.2.2
Ruby 2.6.0p0
gem 'fog-backblaze', '~> 0.3.0'
gem 'shrine-fog', '~> 2.0', '>= 2.0.1'
gem 'shrine', '~> 3.2', '>= 3.2.1'
Model
class Show < ApplicationRecord
include ImageUploader::Attachment(:artwork)
end
Image Uploader
class ImageUploader < Shrine
plugin :store_dimensions
Attacher.validate do
validate_max_size 10*1024*1024
validate_mime_type %w[image/jpeg image/png]
validate_extension %w[jpg jpeg png]
end
end
Shrine.rb
require "shrine"
require "shrine/storage/fog"
require "fog/backblaze"
storeb2 = Fog::Storage.new(
provider: 'backblaze',
b2_key_id: '###',
b2_key_token: '###',
b2_bucket_name: 'test',
b2_bucket_id: '###'
)
Shrine.storages[:store] = Shrine::Storage::Fog.new(
connection: storeb2,
directory: "test",
)
Shrine.storages[:cache] = Shrine::Storage::Fog.new(
connection: storeb2,
directory: "test",
)
Shrine.plugin :activerecord
Shrine.plugin :cached_attachment_data
Shrine.plugin :restore_cached_data
Shrine.plugin :validation
Shrine.plugin :validation_helpers
Controller Code Where it is blowing up
def create
@show = Show.new(show_params)
if @show.save
flash[:notice] = "The new show has been added to your account"
redirect_to @show
else
render :new
end
end
private
def show_params
params.require(:show).permit(:title, :artwork)
end
Do you know what this could be?
my fog-b2.rb
CarrierWave.configure do |config|
config.fog_provider = 'fog/backblaze'
config.fog_credentials = {
provider: 'backblaze',
b2_key_id: Rails.application.credentials.dig(:backblaze, :b2_key_id),
b2_key_token: Rails.application.credentials.dig(:backblaze, :b2_key_token),
# optional, if you wanna see B2 requests in log
logger: Rails.logger,
b2_bucket_name: Rails.application.credentials.dig(:backblaze, :b2_bucket_name),
b2_bucket_id: Rails.application.credentials.dig(:backblaze, :b2_bucket_id)
}
config.fog_directory = 'mydirec'
config.asset_host = 'https://mycdn.site.com/file/path'
config.fog_public = false # true # default is true
#fog_attributes supports only: content_type, last_modified, content_disposition
#config.fog_attributes = { ... }
#this can help debugging in development
config.ignore_integrity_errors = false
config.ignore_processing_errors = false
config.ignore_download_errors = false
end
#CarrierWave.clean_cached_files!
When I execute
Model.avatar.recreate_versions!(:thumb)
ArgumentError: wrong number of arguments (given 3, expected 2)
.rbenv/versions/2.6.2/lib/ruby/gems/2.6.0/gems/fog-backblaze-0.2.0/lib/fog/storage/backblaze/real.rb:148:in `head_object'
I am trying to use the sample code
pp connection.put_object("fog-smoke-test", "my file", "THISISATESTFILE").json
but it keep throwing
NoMethodError: undefined method `parse' for Fog::JSON:Module
Did you mean? parent
from ~/.rvm/gems/ruby-2.5.1/gems/fog-backblaze-0.1.1/lib/fog/backblaze/json_response.rb:12:in `json'
Fog::Errors::Error (Failed put_object, status = 503 {"code"=>"service_unavailable", "message"=>"c002_v0001111_t0059 is too busy", "status"=>503}):
When users upload images to the B2 bucket, errors sometimes occur.
I found this article
https://www.backblaze.com/b2/docs/integration_checklist.html
But I don't know what I should do. :-(
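The integration checklist asks clients to treat 503 service_unavailable as transient and retry with backoff. A hedged sketch of what that could look like wrapped around put_object; the error class to rescue is a parameter here because the exact exception (e.g. Fog::Errors::Error) depends on the gem version:

```ruby
# Retry a block on transient (503-style) errors with exponential backoff.
# B2 returns service_unavailable when a pod is busy; clients are expected
# to back off and retry (ideally against a fresh upload URL).
def with_retries(max_attempts: 5, base_delay: 1, error_class: StandardError)
  attempts = 0
  begin
    attempts += 1
    yield
  rescue error_class
    raise if attempts >= max_attempts
    sleep(base_delay * (2**(attempts - 1)))   # 1s, 2s, 4s, ...
    retry
  end
end

# Hypothetical usage:
# with_retries(error_class: Fog::Errors::Error) do
#   connection.put_object(bucket, key, body)
# end
```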
Name: excon
Version: 0.68.0
Advisory: CVE-2019-16779
URL: GHSA-q58g-455p-8vw9
Title: Race condition when using persistent connections
Solution: upgrade to >= 0.71.0
Just a heads up that you may want to look into updating dependencies :)
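Per the advisory's solution line, pinning excon in the Gemfile keeps the project out of the vulnerable range:

```ruby
# Gemfile: CVE-2019-16779 is fixed in excon 0.71.0
gem 'excon', '>= 0.71.0'
```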