Armory 0.96.5 Released – Bitcoin Armory – Python-based ...

Bitcoin Armory 0.90 beta - fragmented backups (SSSS), bare signatures, reduced RAM usage, lowered fees, and 30-second load times!

submitted by SwedFTP to Bitcoin [link] [comments]

New to bitcoin, question for you guys. If I have $1000 US, what is the best and safest way to turn that into spendable bitcoin?

What website should I use, where do I store my bitcoins, how do I ensure they are protected - and anything else you guys would do? Thanks
submitted by DutchMastar to Bitcoin [link] [comments]

How do I stop an application running in the background (in Linux-Ubuntu)?

I have the bitcoin Armory wallet installed. This application automatically opens the Bitcoin Core client when it (Armory) is started. However, when I close Armory down, Bitcoin Core continues to run in the background and I can't see it to shut it down. You may well ask how I know it is running at all (as I can't see its icon) - well, if you know Bitcoin Core, you will know that it occupies a lot of RAM. Presently I have to shut down the PC and restart - this gets rid of Bitcoin Core. But I would be surprised if there is not a way to do this without shutting the PC down - perhaps through a terminal command? I should be most grateful for any help. Thanks, DB
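For anyone hitting the same thing: Bitcoin Core can be stopped from a terminal without rebooting. A minimal sketch, assuming the background process is the daemon `bitcoind` (the GUI client is named `bitcoin-qt` - substitute accordingly):

```shell
#!/bin/sh
# Stop a Bitcoin Core instance left running in the background.

stop_bitcoind() {
    # Preferred: ask the node to shut down cleanly over RPC, so it can
    # flush its database to disk first.
    if command -v bitcoin-cli >/dev/null 2>&1 && bitcoin-cli stop 2>/dev/null; then
        echo "asked bitcoind to stop via RPC"
    # Fallback: SIGTERM is also caught by Bitcoin Core and triggers a
    # clean shutdown. Avoid "kill -9" - it risks corrupting the block index.
    elif pkill -TERM -x bitcoind 2>/dev/null; then
        echo "sent SIGTERM to bitcoind"
    else
        echo "no running bitcoind found"
    fi
}

stop_bitcoind
```

Afterwards, `pgrep -a bitcoind` and `free -h` will confirm the process (and its RAM) are gone.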
submitted by Duncan1949 to Ubuntu [link] [comments]

Greg Maxwell /u/nullc (CTO of Blockstream) has sent me two private messages in response to my other post today (where I said "Chinese miners can only win big by following the market - not by following Core/Blockstream."). In response to his private messages, I am publicly posting my reply, here:

Greg Maxwell nullc sent me 2 short private messages criticizing me today. For whatever reason, he seems to prefer messaging me privately these days, rather than responding publicly on these forums.
Without asking him for permission to publish his private messages, I do think it should be fine for me to respond to them publicly here - only quoting 3 phrases from them, namely: "340GB", "paid off", and "integrity" LOL.
There was nothing particularly new or revealing in his messages - just more of the same stuff we've all heard before. I have no idea why he prefers responding to me privately these days.
Everything below is written by me - I haven't tried to upload his 2 PMs to me, since he didn't give permission (and I didn't ask). The only stuff below from his 2 PMs is the 3 phrases already mentioned: "340GB", "paid off", and "integrity". The rest of this long wall of text is just my "open letter to Greg."
TL;DR: The code that maximally uses the available hardware and infrastructure will win - and there is nothing Core/Blockstream can do to stop that. Also, things like the Berlin Wall or the Soviet Union lasted for a lot longer than people expected - but, conversely, they also got swept away a lot faster than anyone expected. The "vote" for bigger blocks is an ongoing referendum - and Classic is running on 20-25% of the network (and can and will jump up to the needed 75% very fast, when investors demand it due to the inevitable "congestion crisis") - which must be a massive worry for Greg/Adam/Austin and their backers from the Bilderberg Group. The debate will inevitably be decided in favor of bigger blocks - simply because the market demands it, and the hardware / infrastructure supports it.
Hello Greg Maxwell nullc (CTO of Blockstream) -
Thank you for your private messages in response to my post.
I respect (most of) your work on Bitcoin, but I think you were wrong on several major points in your messages, and in your overall economic approach to Bitcoin - as I explain in greater detail below:
Correcting some inappropriate terminology you used
As everybody knows, Classic or Unlimited or Adaptive (all of which I did mention specifically in my post) do not support "340GB" blocks (which I did not mention in my post).
It is therefore a straw-man for you to claim that big-block supporters want "340GB" blocks. Craig Wright may want that - but nobody else supports his crazy posturing and ridiculous ideas.
You should know that what actual users / investors (and Satoshi) actually do want, is to let the market and the infrastructure decide on the size of actual blocks - which could be around 2 MB, or 4 MB, etc. - gradually growing in accordance with market needs and infrastructure capabilities (free from any arbitrary, artificial central planning and obstructionism on the part of Core/Blockstream, and its investors - many of whom have a vested interest in maintaining the current debt-backed fiat system).
You yourself (nullc) once said somewhere that bigger blocks would probably be fine - ie, they would not pose a decentralization risk. (I couldn't find the link at first - but I have since found it:)
I am also surprised that you now seem to be among those making unfounded insinuations that posters such as myself must somehow be "paid off" - as if intelligent observers and participants could not decide on their own, based on the empirical evidence, that bigger blocks are needed, when the network is obviously becoming congested and additional infrastructure is obviously available.
Random posters on Reddit might say and believe such conspiratorial nonsense - but I had always thought that you, given your intellectual abilities, would have been able to determine that people like me are able to arrive at supporting bigger blocks quite entirely on our own, based on two simple empirical facts, ie:
  • the infrastructure supports bigger blocks now;
  • the market needs bigger blocks now.
In the present case, I will simply assume that you might be having a bad day, for you to erroneously and groundlessly insinuate that I must be "paid off" in order to support bigger blocks.
Using Occam's Razor
The much simpler explanation is that bigger-block supporters believe they will get "paid off" from bigger gains on their investment in Bitcoin.
Rational investors and users understand that bigger blocks are necessary, based on the apparent correlation (not necessarily causation!) between volume and price (as mentioned in my other post, and backed up with graphs).
And rational network capacity planners (a group which you should be in - but for some mysterious reason, you're not) also understand that bigger blocks are necessary, and quite feasible (and do not pose any undue "centralization risk".)
As I have been on the record for months publicly stating, I understand that bigger blocks are necessary based on the following two objective, rational reasons:
  • because I've seen the graphs; and
  • because I've seen the empirical research in the field (from guys like Gavin and Toomim) showing that the network infrastructure (primarily bandwidth and latency - but also RAM and CPU) would also support bigger blocks now (I believe they showed that 3-4MB blocks would definitely work fine on the network now - possibly even 8 MB - without causing undue centralization).
Bigger-block supporters are being objective; smaller-block supporters are not
I am surprised that you no longer talk about this debate in those kinds of objective terms:
  • bandwidth, latency (including Great Firewall of China), RAM, CPU;
  • centralization risk
Those are really the only considerations which we should be discussing in this debate - because those are the only rational considerations which might justify the argument for keeping 1 MB.
And yet you, and Adam Back adam3us, and your company Blockstream (financed by the Bilderberg Group, which has significant overlap with central banks and the legacy, debt-based, violence-backed fiat money system that has been running and slowly destroying our world) never make such objective, technical arguments anymore.
And when you make unfounded conspiratorial, insulting insinuations saying people who disagree with you on the facts must somehow be "paid off", then you are now talking like some "nobody" on Reddit - making wild baseless accusations that people must be "paid off" to support bigger blocks, something I had always thought was "beneath" you.
Instead, Occam's Razor suggests that people who support bigger blocks are merely doing so out of:
  • simple, rational investment policy; and
  • simple, rational capacity planning.
At this point, the burden is on guys like you (nullc) to explain why you support a so-called scaling "roadmap" which is not aligned with:
  • simple, rational investment policy; and
  • simple, rational capacity planning
The burden is also on guys like you to show that you do not have a conflict of interest, due to Blockstream's highly-publicized connections (via insurance giant AXA - whose CEO is also the Chairman of the Bilderberg Group; and companies such as the "Big 4" accounting firm PwC) to the global cartel of debt-based central banks with their infinite money-printing.
In a nutshell, the argument of big-block supporters is simple:
If the hardware / network infrastructure supports bigger blocks (and it does), and if the market demands it (and it does), then we certainly should use bigger blocks - now.
You have never provided a counter-argument to this simple, rational proposition - for the past few years.
If you have actual numbers or evidence or facts or even legitimate concerns (regarding "centralization risk" - presumably your only argument) then you should show such evidence.
But you never have. So we can only assume either incompetence or malfeasance on your part.
As I have also publicly and privately stated to you many times, with the utmost of sincerity: We do of course appreciate the wealth of stellar coding skills which you bring to Bitcoin's cryptographic and networking aspects.
But we do not appreciate the obstructionism and centralization which you also bring to Bitcoin's economic and scaling aspects.
Bitcoin is bigger than you.
The simple reality is this: If you can't / won't let Bitcoin grow naturally, then the market is going to eventually route around you, and billions (eventually trillions) of investor capital and user payments will naturally flow elsewhere.
So: You can either be the guy who wrote the software to provide simple and safe Bitcoin scaling (while maintaining "reasonable" decentralization) - or the guy who didn't.
The choice is yours.
The market, and history, don't really care about:
  • which "side" you (nullc) might be on, or
  • whether you yourself might have been "paid off" (or under a non-disclosure agreement written perhaps by some investors associated with the Bilderberg Group and the legacy debt-based fiat money system which they support), or
  • whether or not you might be clueless about economics.
Crypto and/or Bitcoin will move on - with or without you and your obstructionism.
Bigger-block supporters, including myself, are impartial
By the way, my two recent posts this past week on the Craig Wright extravaganza...
...should have given you some indication that I am being impartial and objective, and I do have "integrity" (and I am not "paid off" by anybody, as you so insultingly insinuated).
In other words, much like the market and investors, I don't care who provides bigger blocks - whether it would be Core/Blockstream, or Bitcoin Classic, or (the perhaps confusingly-named) "Bitcoin Unlimited" (which isn't necessarily about some kind of "unlimited" blocksize, but rather simply about liberating users and miners from being "limited" by controls imposed by any centralized group of developers, such as Core/Blockstream and the Bilderbergers who fund you).
So, it should be clear by now I don't care one way or the other about Gavin personally - or about you, or about any other coders.
I care about code, and arguments - regardless of who is providing such things - eg:
  • When Gavin didn't demand crypto proof from Craig, and you said you would have: I publicly criticized Gavin - and I supported you.
  • When you continue to impose needless obstacles to bigger blocks, then I continue to criticize you.
In other words, as we all know, it's not about the people.
It's about the code - and what the market wants, and what the infrastructure will bear.
You of all people should know that that's how these things should be decided.
Fortunately, we can take what we need, and throw away the rest.
Your crypto/networking expertise is appreciated; your dictating of economic parameters is not.
As I have also repeatedly stated in the past, I pretty much support everything coming from you, nullc:
  • your crypto and networking and game-theoretical expertise,
  • your extremely important work on Confidential Transactions / homomorphic encryption.
  • your desire to keep Bitcoin decentralized.
And I (and the network, and the market/investors) will always thank you profusely and quite sincerely for these massive contributions which you make.
But open-source code is (fortunately) à la carte. It's mix-and-match. We can use your crypto and networking code (which is great) - and we can reject your cripple-code (artificially small 1 MB blocks), throwing it where it belongs: in the garbage heap of history.
So I hope you see that I am being rational and objective about what I support (the code) - and that I am also always neutral and impartial regarding who may (or may not) provide it.
And by the way: Bitcoin is actually not as complicated as certain people make it out to be.
This is another point which might be lost on certain people. And that point is this:
The crypto code behind Bitcoin actually is very simple.
And the networking code behind Bitcoin is actually also fairly simple as well.
Right now you may be feeling rather important and special, because you're part of the first wave of development of cryptocurrencies.
But if the cryptocurrency which you're coding (Core/Blockstream's version of Bitcoin, as funded by the Bilderberg Group) fails to deliver what investors want, then investors will dump you so fast your head will spin.
Investors care about money, not code.
So bigger blocks will eventually, inevitably come - simply because the market demand is there, and the infrastructure capacity is there.
It might be nice if bigger blocks would come from Core/Blockstream.
But who knows - it might actually be nicer (in terms of anti-fragility and decentralization of development) if bigger blocks were to come from someone other than Core/Blockstream.
So I'm really not begging you - I'm warning you, for your own benefit (your reputation and place in history), that:
Either way, we are going to get bigger blocks.
Simply because the market wants them, and the hardware / infrastructure can provide them.
And there is nothing you can do to stop us.
So the market will inevitably adopt bigger blocks either with or without you guys - given that the crypto and networking tech behind Bitcoin is not all that complex, and it's open-source, and there is massive pent-up investor demand for cryptocurrency - to the tune of multiple billions (or eventually trillions) of dollars.
It ain't over till the fat lady sings.
Regarding the "success" which certain small-block supporters are (prematurely) gloating about, during this time when a hard-fork has not happened yet: they should bear in mind that the market has only begun to speak.
And the first thing it did when it spoke was to dump about 20-25% of Core/Blockstream nodes in a matter of weeks. (And the next thing it did was that Gemini added Ethereum trading.)
So a sizable percentage of nodes are already using Classic. Despite desperate, irrelevant attempts of certain posters on these forums to "spin" the current situation as a "win" for Core - it is actually a major "fail" for Core.
Because if Core/Blockstream were not "blocking" Bitcoin's natural, organic growth with that crappy little line of temporary anti-spam kludge-code which you and your minions have refused to delete despite Satoshi explicitly telling you to back in 2010 ("MAX_BLOCKSIZE = 1000000"), then there would be something close to 0% nodes running Classic - not 25% (and many more addable at the drop of a hat).
This vote is ongoing.
This "voting" is not like a normal vote in a national election, which is over in one day.
Unfortunately for Core/Blockstream, the "voting" for Classic and against Core is actually a two-year-long referendum.
It is still ongoing, and it can rapidly swing in favor of Classic at any time between now and Classic's install-by date (around January 1, 2018 I believe) - at any point when the market decides that it needs and wants bigger blocks (ie, due to a congestion crisis).
You know this, Adam Back knows this, Austin Hill knows this, and some of your brainwashed supporters on censored forums probably know this too.
This is probably the main reason why you're all so freaked out and feel the need to even respond to us unwashed bigger-block supporters, instead of simply ignoring us.
This is probably the main reason why Adam Back feels the need to keep flying around the world, holding meetings with miners, making PowerPoint presentations in English and Chinese, and possibly also making secret deals behind the scenes.
This is also why Theymos feels the need to censor.
And this is perhaps also why your brainwashed supporters from censored forums feel the need to constantly make their juvenile, content-free, drive-by comments (and perhaps also why you evidently feel the need to privately message me your own comments now).
Because, once again, for the umpteenth time in years, you've seen that we are not going away.
Every day you get another worrisome, painful reminder from us that Classic is still running on 25% of "your" network.
And every day you get another worrisome, painful reminder that Classic could easily jump to 75% in a matter of days - as soon as investors see their $7 billion wealth starting to evaporate when the network goes into a congestion crisis due to your obstructionism and insistence on artificially small 1 MB blocks.
If your code were good enough to stand on its own, then none of Core's globetrotting and campaigning and censorship would be necessary.
But you know, and everyone else knows, that your cripple-code does not include simple and safe scaling - and the competing code (Classic, Unlimited) does.
So your code cannot stand on its own - and that's why you and your supporters feel that it's necessary to keep up the censorship and the lies and the snark. It's shameful that a smart coder like you would be involved with such tactics.
Oppressive regimes always last longer than everyone expects - but they also collapse faster than anyone expects.
We already have interesting historical precedents showing how grassroots resistance to centralized oppression and obstructionism tends to work out in the end. The phenomenon is two-fold:
  • The oppression usually drags on much longer than anyone expects; and
  • The liberation usually happens quite abruptly - much faster than anyone expects.
The Berlin Wall stayed up much longer than everyone expected - but it also came tumbling down much faster than everyone expected.
Examples of oppressive regimes that held on surprisingly long, and collapsed surprisingly fast, are rather common - eg, the fall of the Berlin Wall, or the collapse of the Soviet Union.
(Both examples are actually quite germane to the case of Blockstream/Core/Theymos - as those despotic regimes were also held together by the fragile chewing gum and paper clips of denialism and censorship, and the brainwashed but ultimately complacent and fragile yes-men that inevitably arise in such an environment.)
The Berlin Wall did indeed seem like it would never come down. But the grassroots resistance against it was always there, in the wings, chipping away at the oppression, trying to break free.
And then when it did come down, it happened in a matter of days - much faster than anyone had expected.
That's generally how these things tend to go:
  • oppression and obstructionism drag on forever, and the people oppressing freedom and progress erroneously believe that they are "winning" (in this case: Blockstream/Core and you and Adam and Austin - and the clueless yes-men on censored forums like r\bitcoin who mindlessly support you, and the obedient Chinese miners who, thus far, have apparently been too polite to oppose you);
  • then one fine day, the market (or society) mysteriously and abruptly decides one day that "enough is enough" - and the tsunami comes in and washes the oppressors away in the blink of an eye.
So all these non-entities with their drive-by comments on these threads and their premature gloating and triumphalism are irrelevant in the long term.
The only thing that really matters is investors and users - who are continually applying grassroots pressure on the network, demanding increased capacity to keep the transactions flowing (and the price rising).
And then one day: the Berlin Wall comes tumbling down - or in the case of Bitcoin: a bunch of mining pools have to switch to Classic, and they will switch so fast it will make your head spin.
Because there will be an emergency congestion crisis where the network is causing the price to crash and threatening to destroy $7 billion in investor wealth.
So it is understandable that your supporters might sometimes prematurely gloat, or you might feel the need to try to comment publicly or privately, or Adam might feel the need to jet around the world.
Because a large chunk of people have rejected your code.
And because many more can and will - and they'll do so in the blink of an eye.
Classic is still out there, "waiting in the wings", ready to be installed, whenever the investors tell the miners that it is needed.
Fortunately for big-block supporters, in this "election", the polls don't stay open for just one day, like in national elections.
The voting for Classic is on-going - it runs for two years. It is happening now, and it will continue to happen until around January 1, 2018 (which is when Classic-as-an-option has been set to officially "expire").
To make a weird comparison with American presidential politics: It's kinda like if either Hillary or Trump were already in office - but meanwhile there was also an ongoing election (where people could change their votes as often as they want), and on the day people get fed up with the incompetent incumbent, they can throw them out (and install someone like Bernie instead) in the blink of an eye.
So while the inertia does favor the incumbent (because people are lazy: it takes them a while to become informed, or fed up, or panicked), this kind of long-running, basically never-ending election favors the insurgent (because once the incumbent visibly screws up, the insurgent gets adopted - permanently).
Everyone knows that Satoshi explicitly defined Bitcoin to be a voting system, in and of itself. Not only does the network vote on which valid block to append next to the chain - the network also votes on the very definition of what a "valid block" is.
Go ahead and re-read the anonymous PDF that was recently posted on the subject of how you are dangerously centralizing Bitcoin by trying to prevent any votes from taking place:
The insurgent (Classic, Unlimited) is right (they maximally use available bandwidth) - while the incumbent (Core) is wrong (it needlessly throws bandwidth out the window, choking the network, suppressing volume, and hurting the price).
And you, and Adam, and Austin Hill - and your funders from the Bilderberg Group - must be freaking out that there is no way you can get rid of Classic (due to the open-source nature of cryptocurrency and Bitcoin).
Cripple-code will always be rejected by the network.
Classic is already running on about 20%-25% of nodes, and there is nothing you can do to stop it - except commenting on these threads, or having guys like Adam flying around the world doing PowerPoints, etc.
Everything you do is irrelevant when compared against billions of dollars in current wealth (and possibly trillions more down the road) which needs and wants and will get bigger blocks.
You guys no longer even make technical arguments against bigger blocks - because there are none: Classic's codebase is 99% the same as Core, except with bigger blocks.
So when we do finally get bigger blocks, we will get them very, very fast: because it only takes a few hours to upgrade the software to keep all the good crypto and networking code that Core/Blockstream wrote - while tossing that single line of 1 MB "max blocksize" cripple-code from Core/Blockstream into the dustbin of history - just like people did with the Berlin Wall.
submitted by ydtm to btc [link] [comments]

A Guide to Keeping Keys Offline Using Armory +rPi

Hi Redditors.
I am going to post in this thread my experiences in getting my Desktop (Debian) machine running Armory in watch-only mode, and coupling that with an offline Raspberry Pi (which holds my private keys) for signing the transactions previously made in watch-only mode.
I actually compiled Armory from source directly on my Pi. This guide is probably more for the bitcoin 'power user', as to run Armory online, and broadcast the signed transactions, you need to have a bitcoin full node running (bitcoind).
Basic requirements:
Aimed-for Setup:
I'll post the guide in digestible sections...

Section 1

I should begin by saying I installed source code from git, and got Armory to build the DB on my desktop initially, WITHOUT creating a wallet.. (This allowed me to debug what was going on a little!)
Go to, select Armory..
It leads to a Download from Git:
I followed the procedure for Linux Debian - verify code, compile, install - all straightforward.
Began by running bitcoind, and telling Armory where to find it. This is the command I used, obviously it was all on one line and didn't include the arrows/explanations!:
python \
  --satoshi-datadir=/BlockChain/chain20180414/blocks \   # <--- where my bitcoind blocks live
  --datadir=/ArmoryDataDi \                              # <--- this is instead of ~/.armory
  --dbdir=/ArmoryDataDidatabases                         # <--- again, non-std. place used for Armory's databases (my choice)
So, on the Desktop, after the initial "build databases"
(NB the initial "Build Databases" took about 1.5 hours and my two CPUs were maxed the whole time, with temps up to 62C. Not ideal, but I'm not in a rush!)
I then wanted to import a watch-only wallet.
Before I did this, I took a full backup of the Armory data dir:
(or ~/.armory in a default installation).
I'd hate to have to make Armory do another full sync with the bitcoind node!
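That backup can be as simple as archiving the data directory. A sketch - demonstrated here on a scratch copy under /tmp so nothing real is touched; on the actual system you would archive /ArmoryDataDi (or ~/.armory for a default install):

```shell
# Simulate the Armory data dir with a scratch copy:
mkdir -p /tmp/ArmoryDataDi
touch /tmp/ArmoryDataDi/ArmorySettings.txt

# Archive it (on the real box: tar -czf armory-backup.tar.gz /ArmoryDataDi):
tar -czf /tmp/armory-backup.tar.gz -C /tmp ArmoryDataDi

# List the archive contents to confirm the backup captured the files:
tar -tzf /tmp/armory-backup.tar.gz
```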

Section 2

Next step: offline wallet (with Private Keys) is on a Raspberry Pi.
I downloaded the source and managed to compile it on the pi itself! :)
Though there were some gymnastics needed to setup the Pi.
My Pi is running Raspbian based on Wheezy.. quite old!
I did the following on the Pi:
apt-get update
apt-get upgrade                 # <--- took about an hour!
apt-get install autotools-dev
apt-get install autoconf
Then I followed the instructions exactly as I had done for my Debian Desktop machine, EXCEPT:
I had to increase the Pi's swap space. I upped it from 100 MB to 400 MB.
The compilation took 7 hours, and my poor SD card got a thrashing.
But after compilation, I put the swap back to 100 MB and Armory runs OK with about 150 MB of memory (no swap needed).
Swap increase on the Pi:
use your favourite editor, and open the file /etc/dphys-swapfile
add/change the following line:
Then, restart the Pi (NB this command actually powers it off, so switch it back on afterwards):
sudo shutdown -h -P now 
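For reference, the change amounts to editing one variable - `CONF_SWAPSIZE` is the standard setting in Raspbian's /etc/dphys-swapfile, and 400 matches the figure above. A sketch, done on a scratch copy so nothing on your system is modified:

```shell
# On the Pi you would edit /etc/dphys-swapfile itself (as root) and then
# reboot, or restart the dphys-swapfile service. Demonstrated on a copy:
printf 'CONF_SWAPSIZE=100\n' > /tmp/dphys-swapfile    # Raspbian default

# Raise the swap size from 100 MB to 400 MB:
sed -i 's/^CONF_SWAPSIZE=.*/CONF_SWAPSIZE=400/' /tmp/dphys-swapfile

cat /tmp/dphys-swapfile    # -> CONF_SWAPSIZE=400
```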
Once the compilation was done on the Pi, put the swap back, rebooted and created an Armory wallet.
I added manual entropy and upped the encryption 'time' from 250ms to 2500ms - the Pi is slow, but I'm happy to wait for more iterations of the Key Derivation Function.
Once the wallet was created, it obviously prompts you for backup.
I want to add a private key of my own (i.e. import), so don't do the backup until this is over.
I import my Private Key, and Armory checks that this corresponds to a Public Key, which I check is correct.
This is the point now where the Pi storage medium (e.g an SD card) has to be properly destroyed if you ever get rid of it.
I had thought that now would be a good time to decide if your new wallet will generate Segwit receiving addresses, and also addresses used to receive 'change' after a transaction..
But it seems Armory WON'T let you switch to P2SH-P2WPKH unless your Armory is connected to a node offering "WITNESS" service.
Obviously, my Pi is offline and will never connect to a node, so the following will not work on the Pi:
NB: I thought about setting this on the Debian "watch-only" wallet, but that would surely mean doom, as the Pi would not know about those addresses and backups might not keep them.. who knows...
So, end result:- no segwit for me just yet in my offline funds.

--If anyone can offer a solution to this, I'd be very grateful--

Section 3

Ok, now this is a good point to back up your wallet on the Pi. It has your imported keys. I choose a Digital Backup - and put it on a USB key, which will never touch the internet and will be stored off-site. I also chose to encrypt it, because I'm good with passwords..
NB: The Armory paper backup will NOT back up your imported private keys, so keep those somewhere if you're not sweeping them. It would be prudent to have an Armory paper backup anyway, but remember it will likely NOT help you with that imported key.
Now for the watch-only copy of the wallet. I want to get the "watch-only" version onto my Desktop Debian machine.
On the Pi, I created (exported to a USB key) a "watching-only" copy of my wallet.
I would use the RECOMMENDED approach, export the "Entire Wallet File".
As you will see below, I initially exported only the ROOT data, which will NOT capture the watching-only part of the Private Key I entered manually above (i.e. the public Key!).
Now, back on the Debian Desktop machine...
I stopped all my crontab jobs; just give Armory uninterrupted CPU/memory/disk...
I also stopped bitcoind and made a backup prior to any watch-only wallet being imported.
I already made a backup of Armory on my Desktop, before any wallet import.
(this was needed, as I made a mistake.. see below)
So on the Debian Desktop machine, I begin by firing up bitcoind.
my command for this is:
./bitcoind -daemon -datadir=/BlockChain/chain20180414 -dbcache=400 -maxmempool=400 

Section 4

I try running Armory like this:
(I'm actually starting Armory from a script -
Inside the script, it has the line:
python --ram-usage=4 --satoshi-datadir=/BlockChain/chain20180414/blocks --datadir=/ArmoryDataDi --dbdir=/ArmoryDataDidatabases 
I know from bitter experience that doing a scan over the blockchain for a new wallet takes a looong time and a lot of CPU, and I'd like it to play nicely; not gobble all the memory and swap and run my 2xCPUs both at 100% for four hours...
So... I aim to run with --ram-usage=X and --thread-count=X
(For me in the end, X=1 but I began with X=4)
I began with --ram-usage=4 (<--- = 4 x 128 MB)
The result is below...
TypeError: cannot concatenate 'str' and 'int' objects 
It didn't recognise the ram-usage value and carried on, crippling my Debian desktop PC. (The traceback below shows why: spawnDB builds the argument with '--ram-usage=' + ARMORY_RAM_USAGE, concatenating a string with an integer - wrapping the value in str() would be the clean fix.)
This is where it gets dangerous; Armory can gobble so much memory and CPU that the windowing environment can seize up, and it can take over 30 minutes just to exit nicely from bitcoind and ArmoryDB.
So, I ssh to the machine from another computer, and keep an eye on it with the command
"free -h" 
I'd also be able to do a "sudo reboot now" if needed from here.
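The kind of watch loop I mean looks roughly like this (bounded to three passes here; the interval and pass count are arbitrary):

```shell
# Snapshot memory use a few times, listing the biggest consumers so a
# runaway ArmoryDB or bitcoind shows up before the desktop locks up.
for i in 1 2 3; do
    free -h                                      # overall RAM and swap
    ps -eo pid,comm,%mem --sort=-%mem | head -n 4  # top memory hogs
    sleep 1
done
```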

Section 5

So, trying to get my --ram-usage command recognised, I tried this line (added quotes):
python --ram-usage="4" --satoshi-datadir=/BlockChain/chain20180414/blocks --datadir=/ArmoryDataDi --dbdir=/ArmoryDataDidatabases 
But no, same error...
Loading Armory Engine:
   Armory Version:       0.96.4
   Armory Build:         None
   PyBtcWallet Version:  1.35
Detected Operating system: Linux
   OS Variant            : ('debian', '9.4', '')
   User home-directory   : /home/
   Satoshi BTC directory : /BlockChain/chain20180414
   Armory home dir       : /ArmoryDataDi
   ArmoryDB directory    : /ArmoryDataDidatabases
   Armory settings file  : /ArmoryDataDiArmorySettings.txt
   Armory log file       : /ArmoryDataDiarmorylog.txt
   Do wallet checking    : True
(ERROR) - Unsupported language specified. Defaulting to English (en)
(ERROR) - Failed to start Armory database: cannot concatenate 'str' and 'int' objects
Traceback (most recent call last):
  File "", line 1808, in startArmoryDBIfNecessary
    TheSDM.spawnDB(str(ARMORY_HOME_DIR), TheBDM.armoryDBDir)
  File "/BitcoinArmory/", line 387, in spawnDB
    pargs.append('--ram-usage=' + ARMORY_RAM_USAGE)
TypeError: cannot concatenate 'str' and 'int' objects

Section 6

So, I edit the Armory python file
if ARMORY_RAM_USAGE != -1:
    pargs.append('--ram-usage=4')  # COMMENTED THIS, SO I CAN HARDCODE =4  # ' + ARMORY_RAM_USAGE)
Running it, I now have acknowledgement of the --ram-usage=4:
(WARNING) - Spawning DB with command: /BitcoinArmory/ArmoryDB --db-type="DB_FULL" --cookie --satoshi-datadir="/BlockChain/chain20180414/blocks" --datadir="/ArmoryDataDi" --dbdir="/ArmoryDataDidatabases" --ram-usage=4 
Also, even with ram-usage=4, it used too much memory, so I told it to quit.
It took over 30 minutes to stop semi-nicely. The last thing it reported was:
ERROR - 00:25:21: (StringSockets.cpp:351) FcgiSocket::writeAndRead FcgiError: unexpected fcgi header version 
But that didn't seem to matter or corrupt the Armory Database, so I think it's ok.
So, I get brave and change as below, and I make sure my script has a command line for --ram-usage="ABCDE" and --thread-count="FGHIJ"; the logic being that these strings "ABCDE" will pass the IF criteria below, and my hardcoded values will be used...
if ARMORY_RAM_USAGE != -1:
    pargs.append('--ram-usage=1')  # COMMENTED THIS, SO I CAN HARDCODE =1  # ' + ARMORY_RAM_USAGE)
if ARMORY_THREAD_COUNT != -1:
    pargs.append('--thread-count=1')  # COMMENTED THIS, SO I CAN HARDCODE =1  # ' + ARMORY_THREAD_COUNT)
So, as usual, I use my script and start this with: ./
(which uses command line:)
python --ram-usage="ABCDE" --thread-count="FGHIJ" --satoshi-datadir=/BlockChain/chain20180414/blocks --datadir=/ArmoryDataDi --dbdir=/ArmoryDataDidatabases 
(this forces it to use my hard-coded values in
So, this is the command which it reports that it starts with:
(WARNING) - Spawning DB with command: /BitcoinArmory/ArmoryDB --db-type="DB_FULL" --cookie --satoshi-datadir="/BlockChain/chain20180414/blocks" --datadir="/ArmoryDataDi" --dbdir="/ArmoryDataDidatabases" --ram-usage=1 --thread-count=1 
Again, this is where it gets dangerous; Armory can gobble so much memory and CPU that the windowing environment can seize up. So I ssh to the machine and keep an eye on it with:
"free -h" 

Section 7

So, on the Debian Desktop PC, I inserted the USB stick with the watch-only wallet I exported from the Pi.
Start Armory...
Import "Entire Wallet File" watch-only copy.
Wait 4 hours..
After running Armory for about 30m, the memory usage dropped by 400m... weird...
It took ~2 hours to get 40% completion.
After 3.5 hours it's almost there...
The memory went up to about 1.7Gb in use and 900Mb of Swap, but the machine remained fairly responsive throughout, apart from a few (10?) periods at the start, where it appeared to freeze for 10-30s at a time.
(That's where my ssh session came in handy - I could check the machine was still ok with a "free -h" command)
Now, I can:
Create an unsigned transaction on my Desktop,
Save the tx to USB stick,
Move to the Pi,
Sign the tx,
Move back to the Desktop,
Broadcast the signed tx.

Section 8

My initial Mistake:
This caused me to have to roll back my Armory database using the backup, so you should try to avoid doing this...
On the Pi, I exported only the ROOT data, which will NOT capture the watching-only part of the Private Key
It is RECOMMENDED to use the Digital Export of the Entire Wallet File from the Pi when making a watch-only copy. If you export just the "ROOT data", not the "Entire Wallet File", you'll have problems if you used an imported Private Key in the offline wallet, like I did.
Using the ROOT data text import, after it finished... my balance was zero. So, I tried a Help->Rescan Balance (restart Armory, takes 1 minute to get back up and running). No luck. Still zero balance.
So, I try Rescan Databases.. This will take longer. Nah.. no luck.
So, I tried again, thinking it might be to do with the fact that I imported the text "root data" stuff, instead of following the (Recommended) export of watching-wallet file.
So, I used my Armory backup, and wound back the ArmoryDataDi to the point before the install of the (zero balance) wallet. (you should not need to do this, as you will hopefully use the RECOMMENDED approach of exporting the "Entire Wallet File"!)
submitted by fartinator to Bitcoin [link] [comments]

Bitcoin Core taking ages and using all my RAM

Hey all, so I'm using Bitcoin Core to download the blockchain in order to use Armory. However it's been going 4 days now and it's using 90%+ of my 16 GB of RAM. What's going on, is this normal? My internet speed is good so that's not a problem.
submitted by bck_wrds to Bitcoin [link] [comments]

Is my offline wallet really secure? (Question about Truecrypt/Electrum/Armory)

Hello fellow bitcoiners,
with my bitcoins gaining more and more value, I'm now thinking of setting up secure offline storage. I own an Asus EEE netbook I don't really need, which I can use for this. I haven't decided on the distro yet, but I might choose Crunchbang or Xubuntu, since those are the ones I know. For the offline wallet I'm going to use either Electrum or Armory; the wallet file itself will be encrypted with Truecrypt.
This should keep me safe against online attacks, since I plan to keep the computer offline after completing the setup and installing the latest updates.
But what about offline attacks? Are my Bitcoins safe when my computer gets stolen?
The wallet file is encrypted (twice), but since I don't really know how Truecrypt and Electrum/Armory work under the surface, I can't really be sure that my private keys are safe. So what I want to know is: do Truecrypt/Electrum/Armory store any kind of sensitive information on the hard disk, e.g. in temporary files or in the swap space? What else do I have to consider?
Thank you very much!
submitted by Stueckmuenzen to Bitcoin [link] [comments]

The World Wide Web runs on webservers in datacenters. The World Wide Blockchain should also run on "blockservers" in datacenters. The "sweet spot" of Bitcoin scaling, reliability, security & convenience is *nodes in the cloud* + *private keys offline*. This is the future of Bitcoin. Let's embrace it.

Four-Line Summary
(1) Bitcoin nodes (and everyone's public addresses) should be online - in datacenters.
(2) Bitcoin wallets (and your private keys) should be offline - in your pocket.
(3) This architecture provides the optimal combination or "sweet spot" for short-term and long-term Bitcoin scaling, reliability, security & convenience.
(4) The best communications strategy is for us to embrace the approach of "nodes-in-datacenters" a/k/a "blockservers-in-the-cloud" - instead of apologizing for it.
Longer Summary
(1) Bitcoin nodes should be online - on "online public blockservers", ideally running on big, powerful webservers with high connectivity & high-end specs, in datacenters.
(2) Bitcoin private keys should be offline - in "offline private wallets", ideally running on tiny, cheap computers with no connectivity & low-end specs, in your pocket.
(3) We should embrace "nodes-in-datacenters" (ie, "blockservers-in-the-cloud") and "keys-in-your-pocket" as the future of Bitcoin, providing the optimal combination (or "sweet spot") of scaling, reliability, security & convenience.
Bitcoin has been a success for 7 years and is continuing to grow and needs a simple and safe way to scale.
So, now it is time for people to embrace nodes-in-datacenters a/k/a blockservers-in-the-cloud (plus private keys offline - to enable 100% security with "offline signing of transactions") as Bitcoin's future.
(1) ...because everything on the web actually works this way already - providing the optimal combination of scaling, reliability, security & convenience.
  • You already keep your passwords for websites and webmail on you - usually physically offline (in your head, written on a slip of paper, or maybe in an offline file, etc.)
  • When was the last time you ran a server out of your home to continually spider and index terabytes of data for the entire web?
  • Why should you need to hold 60 GB of data (and growing) when you just want to check the balance of a single Bitcoin address (eg, one of your addresses)?
  • Bitcoin is still very young, and if it is to fulfill its earlier promise of banking the unbanked, microtransactions, DACs (decentralized autonomous corporations), IoT (Internet of Things), smart contracts, etc., then we should hope and expect that the blockchain will someday take up terabytes, not "mere" gigabytes - just like Google's giant search engine index, which they update every few minutes.
  • Do you really think you should be performing this kind of heavy-duty indexing, querying and "serving" on a low-end machine behind a low-end connection in your home, when companies like Google can do it so much better?
  • As long as you physically control your own private keys, who cares if you rely on or (or someday: or or to look up public information about balances and transactions on Bitcoin addresses?
  • They're not going to be able to lie to you. The meaning of "permissionless" and "decentralized" is that anybody can set up a full-node / "blockserver" (plus "blockchain search engines"), and anybody can (and will) immediately report it to the whole world if a website like or (or someday: or or provides false information - which would seriously damage their business, so they'll never do it.
(2) ...because webservers and webmail don't lie to you, and "nodes-in-datacenters" (ie, "blockservers-in-the-cloud") aren't going to be able to lie to you either - since it would not be in their interest, and they would get caught if they did.
  • When was the last time or or or ( lied to you when you performed a search or looked up some news?
  • When was the last time or lied to you when you checked the balance at a Bitcoin address?
  • Currently, with billions of websites and news sources ("webservers") running around the world in datacenters, there are "web search engines" (eg,,,, or where you can look up information and news on the World Wide Web. In order to survive, the business model of these "web search engines" is about getting lots of visitors, and providing you with reliable information. It's not in their best interests to lie - so they never do. These sites simply "spider" / "crawl" / "index" the entire massive web out there (every few minutes actually), and then conveniently filter / aggregate / present the results as a free service to you.
  • In the future, when there are 10,000 or 100,000 Bitcoin full-nodes ("blockservers") running around the world in datacenters, there will be "blockchain search engines" (eg,,, or - just like we already have and, etc.) where you will be able to look up transactions and balances on the World Wide Blockchain. In order to survive, their business model will be about getting lots of visitors, and providing you with reliable information. It's not going to be in their best interests to lie - so they never will. These sites will simply "spider" / "crawl" / "index" the entire massive blockchain out there (every few minutes actually), and then conveniently filter / aggregate / present the results as a free service to you.
  • The business model for "blockchain search engines" might eventually involve showing ads or sponsored content along with the Bitcoin blockchain search functions which we are primarily interested in. This would be quite usable and simple and safe, and similar to how most people already use sites like,,, etc.
(3) ...because "nodes-in-datacenters" (ie, "blockservers-in-the-cloud") provide simple scaling now.
  • Nodes-in-the-cloud are the only solution which can provide scaling now - using existing, tested software - by simply adjusting - or totally eliminating - the MAXBLOCKSIZE parameter.
  • They can use existing, tested, reliable software: thousands of 2MB+ nodes are already running.
  • About 1,000 Classic nodes have been spun up in AWS EC2 datacenters (Amazon Web Services - Elastic Compute Cloud) in the past month. (Uninformed yes-men at r\bitcoin try to spin this as a "bad thing" - but we should embrace it as a "good thing", explicitly espousing the philosophy outlined in this post.)
  • "Nodes-in-datacenters" (ie, "blockservers-in-the-cloud") can be flexibly and easily configured to provide all the scaling needed in terms of:
    • Bandwidth (throughput)
    • Hard drive space (storage)
    • RAM (memory)
    • CPU (processing power)
  • The yes-men and sycophants and authoritarians and know-nothings on the censored subreddit r\bitcoin are forever fantasizing about some Rube Goldberg vaporware with a catchy name "Lightning Network" which doesn't even exist, and which (at best, if it ever does come into existence) would be doomed to be slow, centralized and expensive. LN is a non-thing.
  • Those same people on the censored r\bitcoin forum are desperately trying to interpret the thousands of Classic nodes as a negative thing - and their beloved non-existent Lightning Network as a positive thing. This is the kind of typical down-is-up, black-is-white thinking that always happens in a censorship bubble - because the so-called Lightning Network isn't even a thing - while Classic is a reality.
(4) ...because "nodes-in-datacenters" (ie, "blockservers-in-the-cloud") provide more reliability / availability.
  • 24/7/365 tech support,
  • automatic server reboots,
  • server uptime guarantees,
  • electrical power uptime guarantees.
(5) ...because "nodes-in-datacenters" (ie, "blockservers-in-the-cloud") provide better security.
(6) ...because "nodes-in-datacenters" (ie, "blockservers-in-the-cloud") provide more convenience.
(7) ...because separating "full-node" functionality from "wallet" functionality by implementing "hierarchical deterministic (HD)" wallets is cleaner, safer and more user-friendly.
Armory and BIP 0032 provide "hierarchical deterministic (HD)" wallets.
  • "Hierarchical deterministic" wallets are required in order to be able to keep private keys offline, and "offline-sign" transactions. This is because a wallet needs to be "deterministic" in order to be able to generate the same sequence of random private keys in the offline wallet and the online wallet.
  • "Hierarchical deterministic (HD)" wallets are also required in order to allow a user to perform a single, one-time, permanent backup of their wallet - which lasts forever (since a HD wallet already deterministically "knows" the exact sequence of all the private keys which it will generate, now and in the future - unlike the antiquated wallet in Core / Blockstream's insecure, non-user-friendly Bitcoin implementation, which pre-generates keys non-deterministically in batches of 100 - so old backups of Core / Blockstream wallets could actually be missing later-generated private keys, rendering those backups useless).
  • Bitcoin is now over 7 years old, but Core / Blockstream has mysteriously failed to provide this simple, essential feature of HD wallets - while several other Bitcoin implementations have already provided this.
  • This feature is extremely simple, because it is all done entirely offline - not networking, no game theory, no non-deterministic behavior, no concurrency. The "HD wallet" functionality just needs some very basic, standard crypto and random-number libraries to generate a "seed" which determines the entire sequence of all the private keys which the wallet can generate.
  • Newer Bitcoin implementations (unlike Core / Blockstream) have now "modularized" their code, also separating "full-node" functionality from "wallet" functionality at the source code level:
  • in Golang - "btcsuite" from Conformal, providing "btcd" (node) and "btcwallet" (wallet):
  • in Haskell + MySQL/SQLite - "Haskoin":
  • There is also a Bitcoin implementation which provides only a full-node:
  • in Ruby + Postgres - "Toshi" from CoinBase:
  • [Tinfoil] The fact that Core / Blockstream has failed to provide HD and failed to clean up and modularize its messy spaghetti code - and the fact that Armory is now out of business (and both companies received millions of dollars in venture capital, and the lead dev of Armory left because the investors were creating needless obstacles regarding intellectual property rights, licensing, etc.) - these facts are suspicious because they suggest that these corporations may be trying to discourage dev-friendliness, user-friendliness, security, convenience, and on-chain scaling.
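Since the argument leans on the word "deterministic", a toy sketch may help. This is deliberately NOT BIP-32 - just hash-chaining, for intuition only - but it shows why a single seed backup covers every future key, and why an offline signer and an online watch-only copy can independently derive identical key sequences:

```python
import hashlib

# Toy illustration (NOT BIP-32; for intuition only): in a deterministic
# wallet, every private key is a pure function of one seed. So a single
# one-time backup of the seed covers all keys ever generated, and the
# offline and online copies of the wallet can derive the same sequence
# independently, without ever talking to each other.
def toy_key(seed, index):
    return hashlib.sha256(seed + b'/' + str(index).encode()).hexdigest()

offline_wallet = [toy_key(b'one-time backup seed', i) for i in range(5)]
online_wallet  = [toy_key(b'one-time backup seed', i) for i in range(5)]
assert offline_wallet == online_wallet  # same seed -> same sequence, forever
```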
(8) ...because the only thing most users really want and need is total physical control over their private keys.
  • Most people do not want or need to run a Bitcoin full-node, because:
    • A Bitcon full-node consumes lots of disk space and bandwidth, and can be expensive and complicated to set up, run, maintain, and secure.
    • A Bitcoin full-node requires an extremely high level of hardware and software security - which most computer users have never even attempted.
  • As Armory or Electrum users know, the simplest and safest way to provide 100% guaranteed security is by using "offline storage" or "cold storage" or "air gap".
  • In other words, ideally, you should never even let your private keys touch a device which has (or had) the hardware and/or software to go online - ie: no Wi-Fi, no 3G, and no Ethernet cable.
  • This offline machine is used only to generate private keys (where a Bitcoin private key is literally just any truly random number up to around 10^77) - and also used to "offline-sign" transactions.
  • So it is simplest and safest if your private keys are on an offline machine which never can / did go online - and such as machine can be very cheap, because it really only needs to run some very basic random-number-generator and crypto libraries.
  • It would be simplest and safest for people to own a tiny cheap 100% secure offline computer to use only for:
    • generating / storing Bitcoin private keys
    • signing Bitcoin transactions
    • possibly also for generating / storing other kinds of private keys (other cryptocurrencies, GPG keys, etc.)
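To make the "just a random number" point concrete, here is a minimal sketch (illustrative only - a real offline wallet should use audited wallet software, not hand-rolled code) of drawing a private key below the secp256k1 group order using the OS CSPRNG:

```python
import secrets

# A Bitcoin private key is just a uniformly random integer k with
# 0 < k < n, where n is the secp256k1 group order (~1.16e77). An
# air-gapped machine only needs a good entropy source to produce one.
SECP256K1_N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def generate_private_key():
    k = 0
    while k == 0:  # reject the (astronomically unlikely) zero draw
        k = secrets.randbelow(SECP256K1_N)
    return k

key = generate_private_key()
```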
Four-Line Summary / Conclusion:
(1) Bitcoin nodes (and everyone's public addresses) should be online - in datacenters.
(2) Bitcoin wallets (and your private keys) should be offline - in your pocket.
(3) This architecture provides the optimal combination or "sweet spot" for short-term and long-term Bitcoin scaling, reliability, security & convenience.
(4) The best communications strategy is for us to embrace the approach of "nodes-in-datacenters" a/k/a "blockservers-in-the-cloud" - instead of apologizing for it.
submitted by ydtm to btc [link] [comments]

Is anyone else freaked out by this whole blocksize debate? Does anyone else find themself often agreeing with *both* sides - depending on whichever argument you happen to be reading at the moment? And do we need some better algorithms and data structures?

Why do both sides of the debate seem “right” to me?
I know, I know, a healthy debate is healthy and all - and maybe I'm just not used to the tumult and jostling which would be inevitable in a real live open major debate about something as vital as Bitcoin.
And I really do agree with the starry-eyed idealists who say Bitcoin is vital. Imperfect as it may be, it certainly does seem to represent the first real chance we've had in the past few hundred years to try to steer our civilization and our planet away from the dead-ends and disasters which our government-issued debt-based currencies keep dragging us into.
But this particular debate, about the blocksize, doesn't seem to be getting resolved at all.
Pretty much every time I read one of the long-form major arguments contributed by Bitcoin "thinkers" who I've come to respect over the past few years, this weird thing happens: I usually end up finding myself nodding my head and agreeing with whatever particular piece I'm reading!
But that should be impossible - because a lot of these people vehemently disagree!
So how can both sides sound so convincing to me, simply depending on whichever piece I currently happen to be reading?
Does anyone else feel this way? Or am I just a gullible idiot?
Just Do It?
When you first look at it or hear about it, increasing the size seems almost like a no-brainer: The "big-block" supporters say just increase the blocksize to 20 MB or 8 MB, or do some kind of scheduled or calculated regular increment which tries to take into account the capabilities of the infrastructure and the needs of the users. We do have the bandwidth and the memory to at least increase the blocksize now, they say - and we're probably gonna continue to have more bandwidth and memory in order to be able to keep increasing the blocksize for another couple decades - pretty much like everything else computer-based we've seen over the years (some of this stuff is called by names such as "Moore's Law").
On the other hand, whenever the "small-block" supporters warn about the utter catastrophe that a failed hard-fork would mean, I get totally freaked by their possible doomsday scenarios, which seem totally plausible and terrifying - so I end up feeling that the only way I'd want to go with a hard-fork would be if there was some pre-agreed "triggering" mechanism where the fork itself would only actually "switch on" and take effect provided that some "supermajority" of the network (of who? the miners? the full nodes?) had signaled (presumably via some kind of totally reliable p2p trustless software-based voting system?) that they do indeed "pre-agree" to actually adopt the pre-scheduled fork (and thereby avoid any possibility whatsoever of the precious blockchain somehow tragically splitting into two and pretty much killing this cryptocurrency off in its infancy).
So in this "conservative" scenario, I'm talking about wanting at least 95% pre-adoption agreement - not the mere 75% which I recall some proposals call for, which seems like it could easily lead to a 75/25 blockchain split.
But this time, with this long drawn-out blocksize debate, the core devs, and several other important voices who have become prominent opinion shapers over the past few years, can't seem to come to any real agreement on this.
Weird split among the devs
As far as I can see, there's this weird split: Gavin and Mike seem to be the only people among the devs who really want a major blocksize increase - and all the other devs seem to be vehemently against them.
But then on the other hand, the users seem to be overwhelmingly in favor of a major increase.
And there are meta-questions about governance, about about why this didn't come out as a BIP, and what the availability of Bitcoin XT means.
And today or yesterday there was this really cool big-blockian exponential graph based on doubling the blocksize every two years for twenty years, reminding us of the pure mathematical fact that 2^10 is indeed about 1000 - but not really addressing any of the game-theoretic points raised by the small-blockians. So a lot of the users seem to like it, but when so few devs say anything positive about it, I worry: is this just yet more exponential chart porn?
On the one hand, Gavin's and Mike's blocksize increase proposal initially seemed like a no-brainer to me.
And on the other hand, all the other devs seem to be against them. Which is weird - not what I'd initially expected at all (but maybe I'm just a fool who's seduced by exponential chart porn?).
Look, I don't mean to be rude to any of the core devs, and I don't want to come off like someone wearing a tinfoil hat - but it has to cross people's minds that the powers that be (the Fed and the other central banks and the governments that use their debt-issued money to run this world into a ditch) could very well be much more scared shitless than they're letting on. If we assume that the powers that be are using their usual playbook and tactics, then it could be worth looking at the book "Confessions of an Economic Hitman" by John Perkins, to get an idea of how they might try to attack Bitcoin. So, what I'm saying is, they do have a track record of sending in "experts" to try to derail projects and keep everyone enslaved to the Creature from Jekyll Island. I'm just saying. So, without getting ad hominem - let's just make sure that our ideas can really stand scrutiny on their own - as Nick Szabo says, we need to make sure there is "more computer science, less noise" in this debate.
When Gavin Andresen first came out with the 20 MB thing - I sat back and tried to imagine if I could download 20 MB in 10 minutes (which seems to be one of the basic mathematical and technological constraints here - right?)
I figured, "Yeah, I could download that" - even with my crappy internet connection.
And I guess the telecoms might be nice enough to continue to double our bandwidth every two years for the next couple decades – if we ask them politely?
On the other hand - I think we should be careful about entrusting the financial freedom of the world into the greedy hands of the telecoms companies - given all their shady shenanigans over the past few years in many countries. After decades of the MPAA and the FBI trying to chip away at BitTorrent, lately PirateBay has been hard to access. I would say it's quite likely that certain persons at institutions like JPMorgan and Goldman Sachs and the Fed might be very, very motivated to see Bitcoin fail - so we shouldn't be too sure about scaling plans which depend on the willingness of companies Verizon and AT&T to double our bandwith every two years.
Maybe the real important hardware buildout challenge for a company like 21 (and its allies such as Qualcomm) to take on now would not be "a miner in every toaster" but rather "Google Fiber Download and Upload Speeds in every Country, including China".
I think I've read all the major stuff on the blocksize debate from Gavin Andresen, Mike Hearn, Greg Maxwell, Peter Todd, Adam Back, and Jeff Garzik and several other major contributors - and, oddly enough, all their arguments seem reasonable - heck even Luke-Jr seems reasonable to me on the blocksize debate, and I always thought he was a whackjob overly influenced by superstition and numerology - and now today I'm reading the article by Bram Cohen - the inventor of BitTorrent - and I find myself agreeing with him too!
I say to myself: What's going on with me? How can I possibly agree with all of these guys, if they all have such vehemently opposing viewpoints?
I mean, think back to the glory days of a couple of years ago, when all we were hearing was how this amazing unprecedented grassroots innovation called Bitcoin was going to benefit everyone from all walks of life, all around the world:
...basically the entire human race transacting everything into the blockchain.
(Although let me say that I think that people's focus on ideas like driverless cabs creating realtime fare markets based on supply and demand seems to be setting our sights a bit low as far as Bitcoin's abilities to correct the financial world's capital-misallocation problems which seem to have been made possible by infinite debt-based fiat. I would have hoped that a Bitcoin-based economy would solve much more noble, much more urgent capital-allocation problems than driverless taxicabs creating fare markets or refrigerators ordering milk on the internet of things. I was thinking more along the lines that Bitcoin would finally strangle dead-end debt-based deadly-toxic energy industries like fossil fuels and let profitable clean energy industries like Thorium LFTRs take over - but that's another topic. :=)
Paradoxes in the blocksize debate
Let me summarize the major paradoxes I see here:
(1) Regarding the people (the majority of the core devs) who are against a blocksize increase: Well, the small-blocks arguments do seem kinda weird, and certainly not very "populist", in the sense that: When on earth have end-users ever heard of a computer technology whose capacity didn't grow pretty much exponentially year-on-year? All the cool new technology we've had - from hard drives to RAM to bandwidth - started out pathetically tiny and grew to unimaginably huge over the past few decades - and all our software has in turn gotten massively powerful and big and complex (sometimes bloated) to take advantage of the enormous new capacity available.
But now suddenly, for the first time in the history of technology, we seem to have a majority of the devs, on a major p2p project - saying: "Let's not scale the system up. It could be dangerous. It might break the whole system (if the hard-fork fails)."
I don't know, maybe I'm missing something here, maybe someone else could enlighten me, but I don't think I've ever seen this sort of thing happen in the last few decades of the history of technology - devs arguing against scaling up p2p technology to take advantage of expected growth in infrastructure capacity.
(2) But... on the other hand... the dire warnings of the small-blockians about what could happen if a hard-fork were to fail - wow, they do seem really dire! And these guys are pretty much all heavyweight, experienced programmers and/or game theorists and/or p2p open-source project managers.
I must say, that nearly all of the long-form arguments I've read - as well as many, many of the shorter comments I've read from many users in the threads, whose names I at least have come to more-or-less recognize over the past few months and years on reddit and bitcointalk - have been amazingly impressive in their ability to analyze all aspects of the lifecycle and management of open-source software projects, bringing up lots of serious points which I could never have come up with, and which seem to come from long experience with programming and project management - as well as dealing with economics and human nature (eg, greed - the game-theory stuff).
So a lot of really smart and experienced people with major expertise in various areas ranging from programming to management to game theory to politics to economics have been making some serious, mature, compelling arguments.
But, as I've been saying, the only problem to me is: in many of these cases, these arguments are vehemently in opposition to each other! So I find myself agreeing with pretty much all of them, one by one - which means the end result is just a giant contradiction.
I mean, today we have Bram Cohen, the inventor of BitTorrent, arguing (quite cogently and convincingly to me), that it would be dangerous to increase the blocksize. And this seems to be a guy who would know a few things about scaling out a massive global p2p network - since the protocol which he invented, BitTorrent, is now apparently responsible for like a third of the traffic on the internet (and this despite the long-term concerted efforts of major evil players such as the MPAA and the FBI to shut the whole thing down).
Was the BitTorrent analogy too "glib"?
By the way - I would like to go on a slight tangent here and say that one of the main reasons why I felt so "comfortable" jumping on the Bitcoin train back a few years ago, when I first heard about it and got into it, was the whole rough analogy I saw with BitTorrent.
I remembered the perhaps paradoxical fact that when a torrent is more popular (eg, a major movie release that just came out last week), then it actually becomes faster to download. More people want it, so more people have a few pieces of it, so more people are able to get it from each other. A kind of self-correcting economic feedback loop, where more demand directly leads to more supply.
(BitTorrent manages to pull this off by essentially adding a certain structure to the file being shared, so that it's not simply like an append-only list of 1 MB blocks, but rather more like a random-access or indexed array of 1 MB chunks. Say you're downloading a film which is 700 MB. As soon as your "client" program has downloaded a single 1-MB chunk - say chunk #99 - your "client" program instantly turns into a "server" program as well - offering that chunk #99 to other clients. From my simplistic understanding, I believe the Bitcoin protocol does something similar, to provide a p2p architecture. Hence my - perhaps naïve - assumption that Bitcoin already had the right algorithms / architecture / data structure to scale.)
The efficiency of the BitTorrent network seemed to jive with that "network law" (Metcalfe's Law?) about fax machines. This law states that the more fax machines there are, the more valuable the network of fax machines becomes. Or the value of the network grows on the order of the square of the number of nodes.
This is in contrast with other technology like cars, where the more you have, the worse things get. The more cars there are, the more traffic jams you have, so things start going downhill. I guess this is because highway space is limited - after all, we can't pave over the entire countryside, and we never did get those flying cars we were promised, as David Graeber laments in a recent essay in The Baffler magazine :-)
And regarding the "stress test" supposedly happening right now in the middle of this ongoing blocksize debate, I don't know what worries me more: the fact that it apparently is taking only $5,000 to do a simple kind of DoS on the blockchain - or the fact that there are a few rumors swirling around saying that the unknown company doing the stress test shares the same physical mailing address with a "scam" company?
Or maybe we should just be worried that so much of this debate is happening on a handful of forums which are controlled by some guy named theymos who's already engaged in some pretty "contentious" or "controversial" behavior like blowing a million dollars on writing forum software (I guess he never heard that software is open-source)?
So I worry that the great promise of "decentralization" might be more fragile than we originally thought.
Anyways, back to Metcalfe's Law: with virtual stuff, like torrents and fax machines, the more the merrier. The more people downloading a given movie, the faster it arrives - and the more people own fax machines, the more valuable the overall fax network.
So I kind of (naïvely?) assumed that Bitcoin, being "virtual" and p2p, would somehow scale up the same magical way BitTorrent did. I just figured that more people using it would somehow automatically make it stronger and faster.
But now a lot of devs have started talking in terms of the old "scarcity" paradigm, talking about blockspace being a "scarce resource" and talking about "fee markets" - which seems kinda scary, and antithetical to much of the earlier rhetoric we heard about Bitcoin (the stuff about supporting our favorite creators with micropayments, and the stuff about Africans using SMS to send around payments).
Look, when some asshole is in line in front of you at the cash register and he's holding up the line so they can run his credit card to buy a bag of Cheetos, we tend to get pissed off at the guy - clogging up our expensive global electronic payment infrastructure to make a two-dollar purchase. And that's on a fairly efficient centralized system - and presumably after a year or so, VISA and the guy's bank can delete or compress the transaction in their SQL databases.
Now, correct me if I'm wrong, but if some guy buys a coffee on the blockchain, or if somebody pays an online artist $1.99 for their work - then that transaction, a few bytes or so, has to live on the blockchain forever?
Or is there some "pruning" thing that gets rid of it after a while?
And this could lead to another question: Viewed from the perspective of double-entry bookkeeping, is the blockchain "world-wide ledger" more like the "balance sheet" part of accounting, i.e. a snapshot showing current assets and liabilities? Or is it more like the "cash flow" part of accounting, i.e. a journal showing historical revenues and expenses?
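As far as I understand it, the answer is "both": the blockchain itself is the journal (every historical transfer, kept forever), while a node's set of unspent outputs plays the role of the balance-sheet snapshot - which can always be re-derived by replaying the journal. A toy sketch of the two views (illustrative names only, nothing like real Bitcoin data structures):

```python
# Toy illustration of the two accounting views: the full journal
# (every historical transfer, like the blockchain) versus the
# balance-sheet snapshot derived from it (like a node's UTXO set).

journal = [
    ("alice", "bob",   5),   # historical entries are never deleted...
    ("bob",   "carol", 2),   # ...even that $2 coffee lives here forever
    ("alice", "carol", 1),
]

def snapshot(journal, opening_balances):
    """Replay the journal to produce the current-state view."""
    balances = dict(opening_balances)
    for sender, receiver, amount in journal:
        balances[sender] -= amount
        balances[receiver] += amount
    return balances

print(snapshot(journal, {"alice": 10, "bob": 0, "carol": 0}))
# -> {'alice': 4, 'bob': 3, 'carol': 3}
```

The snapshot is small and disposable; it's the ever-growing journal that every full node has to lug around.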
When I think of thousands of machines around the globe having to lug around multiple identical copies of a multi-gigabyte file containing some asshole's coffee purchase forever and ever... I feel like I'm ideologically drifting in one direction (where I'd end up also being against really cool stuff like online micropayments and Africans banking via SMS)... so I don't want to go there.
But on the other hand, when really experienced and battle-tested veterans with major experience in the world of open-source programming and project management (the "small-blockians") warn of the catastrophic consequences of a possible failed hard-fork, I get freaked out and I wonder if Bitcoin really was destined to be a settlement layer for big transactions.
Could the original programmer(s) possibly weigh in?
And I don't mean to appeal to authority - but heck, where the hell is Satoshi Nakamoto in all this? I do understand that he/she/they would want to maintain absolute anonymity - but on the other hand, I assume SN wants Bitcoin to succeed (both for the future of humanity - or at least for all the bitcoins SN allegedly holds :-) - and I understand there is a way that SN can cryptographically sign a message - and I understand that as the original developer of Bitcoin, SN had some very specific opinions about the blocksize... So I'm kinda wondering if Satoshi could weigh in from time to time. Just to help out a bit. I'm not saying "Show us a sign" like a deity or something - but damn it sure would be fascinating and possibly very helpful if Satoshi gave us his/her/their 2 satoshis' worth at this really confusing juncture.
Are we using our capacity wisely?
I'm not a programming or game-theory whiz, I'm just a casual user who has tried to keep up with technology over the years.
It just seems weird to me that here we have this massive supercomputer (500 times more powerful than all the supercomputers in the world combined) doing fairly straightforward "embarrassingly parallel" number-crunching operations to secure a p2p world-wide ledger called the blockchain to keep track of a measly 2.1 quadrillion tokens spread out among a few billion addresses - and a couple of years ago you had people like Rick Falkvinge saying the blockchain would someday be supporting multi-million-dollar letters of credit for international trade and you had people like Andreas Antonopoulos saying the blockchain would someday allow billions of "unbanked" people to send remittances around the village or around the world dirt-cheap - and now suddenly in June 2015 we're talking about blockspace as a "scarce resource" and talking about "fee markets" and partially centralized, corporate-sponsored "Level 2" vaporware like Lightning Network and some mysterious company is "stress testing" or "DoS-ing" the system by throwing away a measly $5,000 and suddenly it sounds like the whole system could eventually head right back into PayPal and Western Union territory again, in terms of expensive fees.
When I got into Bitcoin, I really was heavily influenced by vague analogies with BitTorrent: I figured everyone would just have a tiny little utorrent-type program running on their machine (ie, Bitcoin-QT or Armory or Mycelium etc.).
I figured that just like anyone can host their own blog or webserver, anyone would be able to host their own bank.
Yeah, Google and and Mozilla and Twitter and Facebook and WhatsApp did come along and build stuff on top of TCP/IP, so I did expect a bunch of companies to build layers on top of the Bitcoin protocol as well. But I still figured the basic unit of bitcoin client software powering the overall system would be small and personal and affordable and p2p - like a bittorrent client - or at the most, like a cheap server hosting a blog or email server.
And I figured there would be a way at the software level, at the architecture level, at the algorithmic level, at the data structure level - to let the thing scale - if not infinitely, at least fairly massively and gracefully - the same way the BitTorrent network has.
Of course, I do also understand that with BitTorrent, you're sharing a read-only object (eg, a movie) - whereas with Bitcoin, you're achieving distributed trustless consensus and appending it to a write-only (or append-only) database.
So I do understand that the problem which BitTorrent solves is much simpler than the problem which Bitcoin sets out to solve.
But still, it seems that there's got to be a way to make this thing scale. It's p2p and it's got 500 times more computing power than all the supercomputers in the world combined - and so many brilliant and motivated and inspired people want this thing to succeed! And Bitcoin could be our civilization's last chance to steer away from the oncoming debt-based ditch of disaster we seem to be driving into!
It just seems that Bitcoin has got to be able to scale somehow - and all these smart people working together should be able to come up with a solution which pretty much everyone can agree - in advance - will work.
Right? Right?
A (probably irrelevant) tangent on algorithms and architecture and data structures
I'll finally weigh with my personal perspective - although I might be biased due to my background (which is more on the theoretical side of computer science).
My own modest - or perhaps radical - suggestion would be to ask whether we're really looking at all the best possible algorithms and architectures and data structures out there.
From this perspective, I sometimes worry that the overwhelming majority of the great minds working on the programming and game-theory stuff might come from a rather specific, shall we say "von Neumann" or "procedural" or "imperative" school of programming (ie, C and Python and Java programmers).
It seems strange to me that such a cutting-edge and important computer project would have so little participation from the great minds at the other end of the spectrum of programming paradigms - namely, the "functional" and "declarative" and "algebraic" (and co-algebraic!) worlds.
For example, I was struck in particular by statements I've seen here and there (which seemed rather hubristic or lackadaisical to me - for something as important as Bitcoin), that the specification of Bitcoin and the blockchain doesn't really exist in any form other than the reference implementation(s) (in procedural languages such as C or Python?).
Curry-Howard anyone?
I mean, many computer scientists are aware of the Curry-Howard isomorphism, which basically says that the relationship between a theorem and its proof is equivalent to the relationship between a specification and its implementation. In other words, there is a long tradition in mathematics (and in computer programming) of stating what must hold (the theorem, or the specification) separately from showing how it is achieved (the proof, or the implementation).
And it's not exactly "turtles all the way down" either: a specification is generally simple and compact enough that a good programmer can usually simply visually inspect it to determine if it is indeed "correct" - something which is very difficult, if not impossible, to do with a program written in a procedural, implementation-oriented language such as C or Python or Java.
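Just to make the theorem/proof :: specification/implementation idea concrete in a familiar language (Python here, purely for accessibility - a real effort would use something like Maude): the spec below says what "sorted" means and is simple enough to verify by visual inspection, while the implementation says how, and the two can be checked against each other mechanically:

```python
import random

# Specification: WHAT -- compact enough to verify by eye.
def sorted_spec(xs):
    return all(a <= b for a, b in zip(xs, xs[1:]))

# Implementation: HOW -- an insertion sort, much harder to eyeball
# for correctness than the one-line spec above.
def insertion_sort(xs):
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

# Mechanically check the implementation against the specification
# on random inputs (a crude stand-in for formal verification):
for _ in range(1000):
    xs = [random.randint(-50, 50) for _ in range(random.randint(0, 20))]
    result = insertion_sort(xs)
    assert sorted_spec(result) and result == sorted(xs)
```

Random testing only samples the spec, of course - the whole point of tools like Maude is to reason about all inputs at once.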
So I worry that we've got this tradition, from the open-source github C/Java programming tradition, of never actually writing our "specification", and only writing the "implementation". In mission-critical military-grade programming projects (which often use languages like Ada or Maude) this is simply not allowed. It would seem that a project as mission-critical as Bitcoin - which could literally be crucial for humanity's continued survival - should also use this kind of military-grade software development approach.
And I'm not saying rewrite the implementations in these kind of theoretical languages. But it might be helpful if the C/Python/Java programmers in the Bitcoin imperative programming world could build some bridges to the Maude/Haskell/ML programmers of the functional and algebraic programming worlds to see if any kind of useful cross-pollination might take place - between specifications and implementations.
For example, the JavaFAN formal analyzer for multi-threaded Java programs (developed using tools based on the Maude language) was applied to the Remote Agent AI program aboard NASA's Deep Space 1 probe, written in Java - and it took only a few minutes using formal mathematical reasoning to detect a potential deadlock which would have occurred years later during the space mission when the damn spacecraft was already way out around Pluto.
And "the Maude-NRL (Naval Research Laboratory) Protocol Analyzer (Maude-NPA) is a tool used to provide security proofs of cryptographic protocols and to search for protocol flaws and cryptosystem attacks."
These are open-source formal reasoning tools developed by DARPA and used by NASA and the US Navy to ensure that program implementations satisfy their specifications. It would be great if some of the people involved in these kinds of projects could contribute to help ensure the security and scalability of Bitcoin.
But there is a wide abyss between the kinds of programmers who use languages like Maude and the kinds of programmers who use languages like C/Python/Java - and it can be really hard to get the two worlds to meet. There is a bit of rapprochement between these language communities in languages which might be considered as being somewhere in the middle, such as Haskell and ML.
I just worry that Bitcoin might be turning into an exclusively C/Python/Java project (with the algorithms and practitioners traditionally of that community), when it could be more advantageous if it also had some people from the functional and algebraic-specification and program-verification community involved as well.
The thing is, though: the theoretical practitioners are big on "semantics" - I've heard them say stuff like "Yes but a C / C++ program has no easily identifiable semantics". So to get them involved, you really have to first be able to talk about what your program does (specification) - before proceeding to describe how it does it (implementation). And writing high-level specifications is typically very hard using the syntax and semantics of languages like C and Java and Python - whereas specs are fairly easy to write in Maude - and not only that, they're executable, and you can state and verify properties about them - which provides for the kind of debate Nick Szabo was advocating ("more computer science, less noise").
Imagine if we had an executable algebraic specification of Bitcoin in Maude, where we could formally reason about and verify certain crucial game-theoretical properties - rather than merely hand-waving and arguing and deploying and praying.
And so in the theoretical programming community you've got major research on various logics such as Girard's Linear Logic (which is resource-conscious) and Bruni and Montanari's Tile Logic (which enables "pasting" bigger systems together from smaller ones in space and time), and executable algebraic specification languages such as Meseguer's Maude (which would be perfect for game theory modeling, with its functional modules for specifying the deterministic parts of systems and its system modules for specifying non-deterministic parts of systems, and its parameterized skeletons for sketching out the typical architectures of mobile systems, and its formal reasoning and verification tools and libraries which have been specifically applied to testing and breaking - and fixing - cryptographic protocols).
And somewhat closer to the practical hands-on world, you've got stuff like Google's MapReduce and lots of Big Data database languages developed by Google as well. And yet here we are with a mempool growing dangerously big for RAM on a single machine, and a 20-GB append-only list as our database - and not much debate on practical results from Google's Big Data databases.
(And by the way: maybe I'm totally ignorant for asking this, but I'll ask anyways: why the hell does the mempool have to stay in RAM? Couldn't it work just as well if it were stored temporarily on the hard drive?)
And you've got CalvinDB out of Yale which apparently provides an ACID layer on top of a massively distributed database.
Look, I'm just an armchair follower cheering on these projects. I can barely manage to write a query in SQL, or read through a C or Python or Java program. But I would argue two points here: (1) these languages may be too low-level and "non-formal" for writing and modeling and formally reasoning about and proving properties of mission-critical specifications - and (2) there seem to be some Big Data tools already deployed by institutions such as Google and Yale which support global petabyte-size databases on commodity boxes with nice properties such as near-real-time and ACID - and I sometimes worry that the "core devs" might be failing to review the literature (and reach out to fellow programmers) out there to see if there might be some formal program-verification and practical Big Data tools out there which could be applied to coming up with rock-solid, 100% consensus proposals to handle an issue such as blocksize scaling, which seems to have become much more intractable than many people might have expected.
I mean, the protocol solved the hard stuff: the elliptical-curve stuff and the Byzantine General stuff. How the heck can we be falling down on the comparatively "easier" stuff - like scaling the blocksize?
It just seems like defeatism to say "Well, the blockchain is already 20-30 GB and it's gonna be 20-30 TB ten years from now - and we need 10 Mbps bandwidth now and 10,000 Mbps bandwidth 20 years from now - assuming the evil Verizon and AT&T actually give us that - so let's just become a settlement platform and give up on buying coffee or banking the unbanked or doing micropayments, and let's push all that stuff into some corporate-controlled vaporware without even a whitepaper yet."
So you've got Peter Todd doing some possibly brilliant theorizing and extrapolating on the idea of "treechains" - there is a Let's Talk Bitcoin podcast from about a year ago where he sketches the rough outlines of this idea out in a very inspiring, high-level way - although the specifics have yet to be hammered out. And we've got Blockstream also doing some hopeful hand-waving about the Lightning Network.
Things like Peter Todd's treechains - which may be similar to the spark in some devs' eyes called Lightning Network - are examples of the kind of algorithm or architecture which might manage to harness the massive computing power of miners and nodes in such a way that certain kinds of massive and graceful scaling become possible.
It just seems like a kind of tiny dev community working on this stuff.
Being a C or Python or Java programmer should not be a pre-req to being able to help contribute to the specification (and formal reasoning and program verification) for Bitcoin and the blockchain.
XML and UML are crap modeling and specification languages, and C and Java and Python are even worse (as specification languages - although as implementation languages, they are of course fine).
But there are serious modeling and specification languages out there, and they could be very helpful at times like this - where what we're dealing with is questions of modeling and specification (ie, "needs and requirements").
One just doesn't often see the practical, hands-on world of open-source github implementation-level programmers and the academic, theoretical world of specification-level programmers meet. I wish there were some way to get these two worlds to collaborate on Bitcoin.
Maybe a good first step to reach out to the theoretical people would be to provide a modular executable algebraic specification of the Bitcoin protocol in a recognized, military/NASA-grade specification language such as Maude - because that's something the theoretical community can actually wrap their heads around, whereas it's very hard to get them to pay attention to something written only as a C / Python / Java implementation (without an accompanying specification in a formal language).
They can't check whether the program does what it's supposed to do - if you don't provide a formal mathematical definition of what the program is supposed to do.
Specification : Implementation :: Theorem : Proof
You have to remember: the theoretical community is very aware of the Curry-Howard isomorphism. Just like it would be hard to get a mathematician's attention by merely showing them a proof without also telling them what theorem the proof is proving - by the same token, it's hard to get the attention of a theoretical computer scientist by merely showing them an implementation without showing them the specification that it implements.
Bitcoin is currently confronted with a mathematical or "computer science" problem: how to secure the network while getting high enough transactional throughput, while staying within the limited RAM, bandwidth and hard drive space limitations of current and future infrastructure.
The problem only becomes a political and economic problem if we give up on trying to solve it as a mathematical and "theoretical computer science" problem.
There should be a plethora of whitepapers out now proposing algorithmic solutions to these scaling issues. Remember, all we have to do is apply the Byzantine General consensus-reaching procedure to a worldwide database which shuffles 2.1 quadrillion tokens among a few billion addresses. The 21 company has emphatically pointed out that racing to compute a hash to add a block is an "embarrassingly parallel" problem - very easy to decompose among cheap, fault-prone, commodity boxes, and recompose into an overall solution - along the lines of Google's highly successful MapReduce.
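To see why the hash race is "embarrassingly parallel", here's a toy proof-of-work in Python (not Bitcoin's actual double-SHA256-over-a-block-header loop - the header string, difficulty and chunk sizes are all made up): nonce ranges are handed to independent workers that need no coordination at all, MapReduce-style.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

DIFFICULTY = "0000"  # toy target: hash must start with four hex zeros

def search_range(header, start, stop):
    """Scan one nonce range -- needs no coordination with other workers."""
    for nonce in range(start, stop):
        digest = hashlib.sha256(f"{header}{nonce}".encode()).hexdigest()
        if digest.startswith(DIFFICULTY):
            return nonce, digest
    return None

def mine(header, workers=4, chunk=100_000):
    """Decompose the nonce space into independent ranges, search them in
    parallel, and recompose the first solution found.
    (Threads keep this sketch portable; a real miner would use processes
    or GPUs, since hashing is CPU-bound.)"""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(search_range, header, i * chunk, (i + 1) * chunk)
                   for i in range(workers)]
        for f in futures:
            if f.result() is not None:
                return f.result()
    return None

result = mine("block-header-data")
if result:
    nonce, digest = result
    print(f"nonce {nonce} -> {digest[:12]}...")
```

The decompose/recompose structure is the whole point: no worker ever needs to see another worker's progress.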
I guess what I'm really saying (and I don't mean to be rude here) is that C and Python and Java programmers might not be the best qualified people to develop and formally prove the correctness of (note I do not say: "test", I say "formally prove the correctness of") these kinds of algorithms.
I really believe in the importance of getting the algorithms and architectures right - look at Google Search itself, it uses some pretty brilliant algorithms and architectures (eg, MapReduce, Paxos) which enable it to achieve amazing performance - on pretty crappy commodity hardware. And look at BitTorrent, which is truly p2p, where more demand leads to more supply.
So, in this vein, I will close this lengthy rant with an oddly specific link - which may or may not be able to make some interesting contributions to finding suitable algorithms, architectures and data structures which might help Bitcoin scale massively. I have no idea if this link could be helpful - but given the near-total lack of people from the Haskell and ML and functional worlds in these Bitcoin specification debates, I thought I'd be remiss if I didn't throw this out - just in case there might be something here which could help us channel the massive computing power of the Bitcoin network in such a way as to enable us simply sidestep this kind of desperate debate where both sides seem right because the other side seems wrong.
The above paper is about "higher dimensional trees". It uses a bit of category theory (not a whole lot) and a bit of Haskell (again not a lot - just a simple data structure called a Rose tree, which has a wikipedia page) to develop a very expressive and efficient data structure which generalizes from lists to trees to higher dimensions.
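For reference, the Rose tree mentioned above is a tiny definition - a value plus a list of child subtrees - here transcribed into Python (the fold method and the chain/tree examples are my own illustration, not from the paper):

```python
# A Rose tree: a value plus an arbitrary number of child subtrees.
# (In Haskell: data Rose a = Node a [Rose a])

class Rose:
    def __init__(self, value, children=()):
        self.value = value
        self.children = list(children)

    def fold(self, f):
        """Reduce the whole tree bottom-up with f(value, child_results)."""
        return f(self.value, [c.fold(f) for c in self.children])

# A blockchain is the degenerate case: each node has at most one child...
chain = Rose("block0", [Rose("block1", [Rose("block2")])])
# ...while a tree-chain/side-chain-like shape just allows wider branching:
tree = Rose("root", [Rose("left", [Rose("l1"), Rose("l2")]), Rose("right")])

count = tree.fold(lambda v, kids: 1 + sum(kids))
assert count == 5  # root, left, l1, l2, right
```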
I have no idea if this kind of data structure could be applicable to the current scaling mess we apparently are getting bogged down in - I don't have the game-theory skills to figure it out.
I just thought that since the blockchain is like a list, and since there are some tree-like structures which have been grafted on for efficiency (eg Merkle trees) and since many of the futuristic scaling proposals seem to also involve generalizing from list-like structures (eg, the blockchain) to tree-like structures (eg, side-chains and tree-chains)... well, who knows, there might be some nugget of algorithmic or architectural or data-structure inspiration there.
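Since Merkle trees came up: here's a minimal sketch of how a Merkle root folds a list of transactions up into a single tree-structured hash (it follows Bitcoin's double-SHA256 and duplicate-the-last-hash conventions, but skips real transaction serialization):

```python
import hashlib

def sha256d(data: bytes) -> bytes:
    """Double SHA-256, as Bitcoin uses for Merkle hashing."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(leaves):
    """Fold a list of leaf payloads up into a single root hash."""
    level = [sha256d(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # Bitcoin duplicates the last hash
            level.append(level[-1])   # when a level has an odd count
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

txs = [b"tx-a", b"tx-b", b"tx-c"]
root = merkle_root(txs)
# Changing any single leaf changes the root -- that's what lets a
# list-like chain carry an efficiently verifiable tree on top:
assert root != merkle_root([b"tx-a", b"tx-b", b"tx-X"])
```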
So... TL;DR:
(1) I'm freaked out that this blocksize debate has splintered the community so badly and dragged on so long, with no resolution in sight, and both sides seeming so right (because the other side seems so wrong).
(2) I think Bitcoin could gain immensely by using high-level formal, algebraic and co-algebraic program specification and verification languages (such as Maude including Maude-NPA, Mobile Maude parameterized skeletons, etc.) to specify (and possibly also, to some degree, verify) what Bitcoin does - before translating to low-level implementation languages such as C and Python and Java saying how Bitcoin does it. This would help to communicate and reason about programs with much more mathematical certitude - and possibly obviate the need for many political and economic tradeoffs which currently seem dismally inevitable - and possibly widen the collaboration on this project.
(3) I wonder if there are some Big Data approaches out there (eg, along the lines of Google's MapReduce and BigTable, or Yale's CalvinDB), which could be implemented to allow Bitcoin to scale massively and painlessly - and to satisfy all stakeholders, ranging from millionaires to micropayments, coffee drinkers to the great "unbanked".
submitted by BeYourOwnBank to Bitcoin [link] [comments]

Colored coins (decentralized exchange): we need developers to complete client implementation faster

Brief intro about colored coins (you can skip it if you already know): Colored coins can be used to encode information about ownership of external property (e.g. securities, tickets, etc) directly in the blockchain; this enables trustless trading, i.e. securities represented with colored coins can be bought directly for Bitcoin with no 3rd party involvement and no counter-party risk; and trustless trading allows us to create a decentralized exchange: you do not care whom you trade with as long as the price is good, so it is possible to trade in peer-to-peer fashion. More info in the CoinDesk article.
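The "no counter-party risk" part follows from the fact that a single Bitcoin transaction can spend inputs belonging to both parties at once, so it only becomes valid when both have signed - either the whole trade happens or none of it does. A toy model of that atomicity (plain Python objects, no real scripts or signatures):

```python
# Toy model of an atomic colored-coin trade: one transaction spends
# inputs owned by BOTH parties, so it is only valid once both have
# signed -- the security leg and the payment leg cannot be separated.

class Tx:
    def __init__(self, inputs, outputs):
        self.inputs = inputs       # [(owner, value, color)]
        self.outputs = outputs     # [(recipient, value, color)]
        self.signatures = set()

    def sign(self, party):
        self.signatures.add(party)

    def is_valid(self):
        owners = {owner for owner, _, _ in self.inputs}
        return owners <= self.signatures   # every input owner has signed

# Alice sells 1 "share" to Bob for 5 plain units, in one transaction:
trade = Tx(
    inputs=[("alice", 1, "share"), ("bob", 5, None)],
    outputs=[("bob", 1, "share"), ("alice", 5, None)],
)
trade.sign("alice")
assert not trade.is_valid()   # half-signed: nobody can take just one leg
trade.sign("bob")
assert trade.is_valid()       # both signed: the whole swap executes at once
```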
History (of implementation): I've implemented the first proof-of-concept a year ago, but it was very basic and didn't allow trustless trade. A more advanced version, one which supported trustless trade, appeared several months later. In January of 2013 we had a version of ArmoryX (based on Bitcoin Armory) which implemented p2ptrade: it allowed communication between peers via a chat server, so that they could trade when they agree on price. It had a trading interface generally similar to what you see on, and worked more-or-less as advertised, but had several bugs and deficiencies. Particularly, Armory requires about 2 GB of RAM on mainnet, so it is quite a bit cumbersome. For this reason, we decided to focus on a web client (WebcoinX), which will be more accessible to users. Unfortunately, implementing a web client was hard, and also we had very few developers working on it (on average, less than one), so we got a very basic demo only in May.
Recent history: Unfortunately lead developer of WebcoinX disappeared soon after the first demo, so we had more delays... Until I restarted development in July. At that point I got a better theoretic model of colored coins, and some ideas for a better wallet architecture which would avoid possible corner cases and will be more convenient for users. So we started making so-called Next-generation colored coin client (NGCCC) from scratch, also we continued with WebcoinX, as I found more developers.
Current state: We were able to improve the user interface of WebcoinX, so that it is better suited for colored coins, and it now works on testnet (note: it is work-in-progress, not all features work properly!), but we lack manpower to fix remaining problems and optimize it so it will work on the mainnet. At its current pace it would take a lot of time until we get there. And we got a very basic proof-of-concept of NGCCC, but more work needs to be done to make it usable. Currently I'm working on it alone (when I'm not working on WebcoinX), so progress is slow.
Developers, developers, developers: If we can get more developers working on this, we can make a complete colored-coin client in a matter of weeks. But I gotta clarify: we don't just need more developers, we need developers who understand the general structure of a cryptocurrency wallet, and can make implementation decisions on their own. Otherwise, if I'm responsible for all decisions, we'll be bottlenecked at decision-making.
Specifics: WebcoinX is written in JavaScript, NGCCC is written in Python. Both projects are 100% open source, but there are sponsors willing to fund development, so developers will receive bounties proportional to their contributions. Total compensation pool is 70 BTC for each project. Generally, each contributor will record how much time he spent on a particular task in a spreadsheet, and once a specific milestone is complete, each will get a bounty proportional to self-reported efforts. (But I reserve a right to make adjustments, in case someone will try to abuse the system.) Although if somebody bids to implement some component for a fixed sum, that might be a possibility too. I can even accept an offer to implement the whole thing for 70 BTC.
Importance: You might have heard news about and closing down. A year ago GLBSE was shut down abruptly. Things like this cause turmoil on markets, even if no Bitcoins were lost. (We are lucky that no exchange has lost coins so far; I guess such an event, which is very well possible, would have resulted in much bigger panic.) Decentralized exchanges are going to ameliorate this problem, as they will remove the necessity for centralized exchanges. Even if trading relies on some chat server, it is possible to switch to a different server in a matter of minutes, so disruption will be minimal. There are other possible benefits, but I'm not going to go into details here, I could write a few pages about it...
Competition: There are several projects which aim to implement decentralized stock markets. Some of them: Freimarkets, Bitshares, Mastercoin. But only colored coins and Mastercoin promise to enable direct trade for Bitcoins. And colored coins are both more mature (we had a working proof-of-concept a while ago), and I believe better: they don't put any extra data into the blockchain, and they make trustless thin clients (i.e. similar to Electrum) possible.
Myths about colored coins: Pretty much each time I mention them, people say how it is a bad thing, so here's a list of common misconceptions:
  1. It is a bad thing because it bloats blockchain. The approach we use now puts no extra data into blockchain, it creates only ordinary Bitcoin transactions, which are not different from trade transactions. Vilifying colored coins for this reason makes as much sense as vilifying selling coffee for Bitcoins: if millions of cups of coffee will be sold each day, it will create a lot of blockchain bloat!
  2. You should use a separate blockchain because it rustles my jimmies. Well, we use specifically the Bitcoin blockchain for a good reason: it enables trading securities for Bitcoins directly, with no 3rd party involved, and securely. This is what we want and this is what people want.
  3. Namecoin was first! Yes, it was, so? Use of Namecoin names for trading has much higher overhead. It is outright impossible to create a private currency on top of it, for example.
  4. But, cross-chain trade script. It exists, but it isn't as good as atomic coin swapping which we use.
  5. Anti-dust patch killed colored coins. It made order-based coloring much less convenient, but we now use a more general theoretic model which allows us to use different 'color schemes'. Particularly, improved version of order-based coloring eliminates all inconveniences created by anti-dust patch.
  6. Mixing and diluting problems. This can happen only if there is a bug in a colored-coin wallet. But you can lose your money if there is a bug in a Bitcoin client, e.g. some people managed to pay as much as 200 BTC in fees because client software forgot to include a change output. At least with colored coins, if shit happens you can try to get compensation from an issuer, as it is easy to prove that coins were destroyed. And if he is kind enough, he will just send you a replacement. Also, we now use a better wallet architecture which will prevent problems like this.
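To give a flavor of what "order-based coloring" means (a deliberately simplified sketch of the general idea - a real color kernel also has to handle padding, uncolored change, and the anti-dust rules mentioned above): colored value flows from a transaction's inputs to its outputs in order, so a wallet can trace colors without putting any extra data into the blockchain.

```python
# Simplified order-based coloring: colored value is assigned from a
# transaction's inputs to its outputs in order, left to right.
# (A sketch of the idea only, not any client's exact scheme.)

def color_outputs(input_colored, output_values):
    """input_colored:  list of (value, color) for inputs, in order.
       output_values:  list of output values, in order.
       Returns a list of (value, color_or_None) for the outputs."""
    inputs = list(input_colored)
    colored = []
    for out_value in output_values:
        need, colors = out_value, set()
        while need > 0 and inputs:
            value, color = inputs[0]
            take = min(value, need)
            colors.add(color)
            need -= take
            if take == value:
                inputs.pop(0)
            else:
                inputs[0] = (value - take, color)
        # An output keeps a color only if it was filled from one color;
        # mixing different colors yields an uncolored output:
        colored.append((out_value, colors.pop() if len(colors) == 1 else None))
    return colored

# 10 "gold" units plus 5 uncolored units in; a 10-unit gold output and
# a 5-unit plain output come out, with nothing extra in the blockchain:
outs = color_outputs([(10, "gold"), (5, None)], [10, 5])
assert outs == [(10, "gold"), (5, None)]
```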
submitted by killerstorm to Bitcoin [link] [comments]

Is bitcoin armory infeasible to use due to excessive initialization times?

I'm using the Linux version 0.96 and it is literally taking days to complete initialization on an i7 laptop with 8 GB RAM and an SSD. It appears to be using only one of my 8 CPU cores, so I'm unsure if the problem can be solved with a change to the code to support multi-threaded execution. 'Parsing TX Hashes' and 'Build Databases' are the steps that take unfathomably long. My Bitcoin-Qt synchronized OK. I'm concerned about using Bitcoin Armory in the long term, as the blockchain will only get longer, and I don't want to wait a week for this kind of operation to complete. I note that my RAM usage is not particularly large at present, so it appears to be a CPU-bound operation.
submitted by clickedbymistake to Bitcoin [link] [comments]

Paranoid bossy tyrants...

I am really sick of the bossy, paranoid Bitcoin tyrants telling us "Don't trust!" Bitcoin is designed to work in a trust-free environment. That does not mean that trust ought to be banned outright. Freedom says we can trust who we want to trust and not trust those who we don't.
Coinbase and Circle aren't Mt. Gox. If grandma prefers trusting them with her bitcoin over building a dedicated offline Linux Armory wallet, then let her be free!
There seems to be a thought process in the Bitcoin faithful that advocates the most paranoid practice is the best practice.
FWIW, I buy my bitcoin at Circle or Coinbase, transfer it to a wallet I control, and keep my large amounts in paper wallets that I generated with an online generator. If you had a RAM scraper installed on my computer when I generated them, or have access to the memory in my laser printer, go ahead and take them. It's my gift to you. ;-)
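For readers curious what "generating a paper wallet" actually involves under the hood: the printable private key is just a random 256-bit number run through standard Base58Check encoding (the WIF format). A minimal self-contained sketch follows, for illustration only; real funds belong with audited wallet code run on an offline machine.

```python
import hashlib
import secrets

B58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58check(payload):
    """Base58Check: payload + 4-byte double-SHA256 checksum, base58-encoded."""
    data = payload + hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, rem = divmod(n, 58)
        out = B58[rem] + out
    # Each leading zero byte is represented by a literal '1'.
    pad = len(data) - len(data.lstrip(b"\x00"))
    return "1" * pad + out

def private_key_to_wif(priv):
    """Uncompressed-key WIF: 0x80 mainnet version byte + 32-byte private key."""
    assert len(priv) == 32
    return base58check(b"\x80" + priv)

# Generate a fresh random key (on a real paper wallet, do this offline).
print(private_key_to_wif(secrets.token_bytes(32)))
```

The poster's RAM-scraper joke is apt: the key exists in memory while this runs, which is exactly why the generation machine (and printer) matter.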
The multiple confirmation thing is another paranoia that is ruining bitcoin. How many times has any transaction been confirmed once and later omitted permanently from the blockchain? Once? Yet the best practice is sold as 6 confirmations? Really?
End of rant..
submitted by fingertoe11 to Bitcoin [link] [comments]

Colored coin client preview #1 (based on Bitcoin Armory)

I think it's already good enough for people to play with it. (Although certainly it's not ready for anything serious.)
For people who are not familiar with concept, colored coins is a technology which allows people to represent arbitrary tokens (e.g. issue private currencies, stocks, bonds, etc.) using small quantities of bitcoins. It is interesting because it would allow us to create decentralized and secure markets. (As decentralized and secure as Bitcoin itself, at least in theory.) See here.
Notes about current release:
Windows binaries:
There are no Linux binaries, but it's really easy to build it on Ubuntu or Debian:
(Note: if you're already using Armory, it is a good idea to hide your ~/.armory directory so it won't be seen by this experimental Armory mod. Or, perhaps, just don't run this experimental mod.)
Before you run it, make sure that bitcoind or Bitcoin-Qt is running and fully sync'ed. Armory takes up to 10 minutes to start (this version is slower because it additionally scans for colored transactions) and requires ~ 1 GB of RAM.
At first start it will offer to create a wallet; do not enable encryption, otherwise issuing colored coins won't work.
Send some bitcoins to this new wallet, 0.02 BTC is probably enough to issue some colored coins and to pay for tx fees.
There is a drop down to choose a color. The balance is displayed for the currently chosen color (i.e. if you chose TESTcc it will show how many TESTcc units this wallet owns), and when you send coins you send coins of that color.
Initially 'uncolored' is selected, which means normal BTC. The drop down also has TESTcc ("test colored coins") and "All colors" (the latter is just for debugging; you cannot send coins in this mode).
Here's what you can do now:
  1. Ask somebody to send you TESTcc. (We want to make it automatic, Satoshi Dice style, but unfortunately that code isn't quite ready.)
  2. Find and install other color definitions.
  3. Issue your own colored coins and send them to somebody who wants them. (LOL.)
Let's start from option #3. There is a 'Hallucinate' menu. (It is called 'hallucinate' because colors do not exist at the blockchain level; they are a client-side convention.) Choose 'Issue colored coins'. Likely all you need to change is the name, but you can tweak satoshis-per-unit and the number of units if you want.
When you click Issue, it will create a new transaction (using your uncolored BTC) and a color definition. Optionally it will also upload your color definition to the color definition registry. (This registry runs on my server, so it might be down.)
You should note the ColorID; this is how other people can refer to these coins (names are ambiguous).
You can now choose this new color in the drop down and it will show your balance (e.g. 1000 units).
Now you'll perhaps want to send these coins to somebody. That person needs to install your color definition first: if you send colored coins without warning, they might be lost, i.e. mixed with uncolored ones. For the same reason it makes no sense to send them to a wallet which isn't color-aware.
For example, you can post on some forum:
I've issued LOLwut coins (ColorID: 36738fe78a443656535503efb585fee140a37d54), each unit represents a bond with face value of 0.1 BTC payable by me, Trololo, via buy back. I promise to buy back all bonds in a month.
Now people who are interested in this LOLwut coin issue will copy the ColorID, paste it into the Hallucinate > Download color definition dialog, and if the color definition is published it will be downloaded and installed. An Armory restart is required to complete installation.
After installation that person will be able to see these LOLwut coins.
Note that if you do not trust my registration server, you can publish the color definition yourself: go to ~/.armory/colordefs, find 36738fe78a443656535503efb585fee140a37d54.colordef and upload it to your own web server. Then you can give people a URL to it and they can download it by URL.
Or they can just obtain this file by any means and copy it to the ~/.armory/colordefs directory. It is decentralized; nobody can prevent you from issuing colored coins.
I think that's all. There is also Hallucinate > Manage color definitions dialog, but I hope it's easy to figure out how it works.
We are working on an improved version, particularly on a p2p exchange feature.
I've set up an IRC channel for people to talk about trying out colored coins: #colored-coins-otc on Freenode.
submitted by killerstorm to Bitcoin [link] [comments]

what's the status of Armory?

So, I'm really impressed with the feature set of Armory. However, I'm a little uncertain of its current development status. Armory got a website update in mid-2013 with a few fancy images, and we haven't seen many real updates since. The founder was on Let's Talk Bitcoin in the fall, and I see he has several speaking engagements. There isn't any real release changelog on the website, and the release and download information is not very well organized; there seems to be less information on the website than there was before the mid-2013 update. There also isn't a very clear way to get from the website to the source code. Because of all this, I haven't updated to the latest beta version; I'm basically afraid of it. Now, however, I can't really use the version of Armory I have installed at all. It takes about 45 minutes to load, and my computer has 12 GB RAM and a quad-core CPU. Along with the extremely slow startup, Armory will randomly crash when sitting idle in the background, sometimes after 1 hour of runtime, sometimes after 12. This random crashing has always been an issue for me, even over a year ago when the blockchain was not so huge. bitcoin-qt and bitcoind run with no problem on my machine.
Anyone know what is going on? I really like Armory, but I'm currently afraid to update to the latest beta version when I don't even know what the changes are.
submitted by andyschroder to Bitcoin [link] [comments]

Another approach to cold storage (a better one!)

This message is particularly for madbitcoins who's recently, and laudably, started a campaign to get people to do proper cold storage.
I want to advocate the use of the 12-word mnemonic seed for a deterministic wallet system, as has been in use with Electrum and Armory for a long time and is now in beta for MultiBit.
In my opinion this is, practically speaking, by far the superior system to any system of backing up actual private keys (including the usual 'paper wallet from printer' construction).
Here are the advantages:
I also think it's worth advocating a 3-level wallet system for most people. There is (1) the wallet that was created on TAILS and never gets opened anywhere else (the cold wallet); (2) the wallet you create on your Linux or Mac everyday machine (the warm wallet), where again the seed helps: I re-install and uninstall Electrum each time I use this wallet, so 99% of the time there is no Electrum 'evidence' on my machine for a casual hacker; and (3) the hot wallet, on a phone or perhaps a hosted account or similar.
3 wallets might seem like overkill, but in my opinion it's far less hassle than constantly having to re-back-up non-deterministic wallets (so dangerous!), and it gives you much stronger security features.
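The determinism being advocated here is easy to see in miniature. The sketch below is an illustration of the concept only: Electrum's real derivation uses its own wordlist and key stretching, and the 12 words used are an arbitrary example, not a real seed phrase.

```python
import hashlib
import hmac

def mnemonic_to_seed(words):
    # Toy stretching; real wallets use many hash iterations (e.g. PBKDF2).
    return hashlib.sha256(" ".join(words).encode()).digest()

def derive_key(seed, index):
    # Each child key is a pure function of (seed, index), so writing down
    # the words once backs up every key the wallet will ever generate.
    return hmac.new(seed, str(index).encode(), hashlib.sha256).digest()

# Arbitrary example words -- NOT a real Electrum seed phrase.
words = "wild sound pond faint glove reply hawk brick note grace stool vapor".split()
seed = mnemonic_to_seed(words)
keys = [derive_key(seed, i) for i in range(3)]
```

Because the same words always reproduce the same keys, the one-time paper backup never goes stale, which is the whole argument against backing up individual private keys.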
submitted by waxwing to worldcryptonetwork [link] [comments]

Want to buy a small amount, getting pretty frustrated.

I only want to buy about $15 worth of BTC.
I'm verified at VoS, but my only options there are a certified cheque sent by registered mail (approx. $8.00) or a wire transfer with a $15 fee.
I tried QuickBT, but I have a Visa/Interac card with TD, which isn't supported by the Interac Online payment. The only other bank accounts I have are with banks unsupported by any service I've found.
Everyone local to me selling on localbitcoins is looking for a minimum of $100.
I'm waiting for verification elsewhere because it looks like they have an email Interac option once verification is complete. Hopefully this works out.
I really want to get into this as I feel BTC is going places. But damn this has been frustrating. Thanks for the vent session.
Also, BTW, if anyone knows: I've been trying to get an online Armory install working. It's downloaded the whole blockchain (roughly 49 GB, yikes) and is working on building the databases. Every time, it gets to 98% with 1 minute left and hangs there for hours. I just upped the RAM to 6 GB and am letting it try again, so we'll see. Has anyone had anything similar that they managed to get working?
Edit, +16 hours later: I'm going to try again with Armory, but this time I'm downloading and installing bitcoin-qt first, with the bootstrap.dat file pre-downloaded. I'll let bitcoin-qt scan and update the blockchain first and then install Armory after. I'm also NOT using a VM this time, so we'll see what happens. I have an offline version set up and have the watching-only wallet exported, but I can't do anything with it until I get this rolling.
submitted by belligerent_coffee to BitcoinCA [link] [comments]

Full Node Survey Results (and how it relates to block size)

Two core developers have mentioned the disappointing full node count in arguments against increasing the block size.
Pieter Wuille: Full node count is at its historically lowest value in years, and outsourcing of full validation keeps growing. source
Greg Maxwell: Unfortunately, every indicator I can think of except fee totals has been going in the wrong direction almost monotonically along with the blockchain size increase since 2012 when we started hitting full blocks and responded by increasing the default soft target. This is frustrating; from a clean slate analysis of network health I think my conclusion would be to decrease the limit below the current 300k/txn/day level. ... The ironic thing is that our frantic efforts to keep ahead and not lose decentralization have both not been enough (by the best measures, full node usage is the lowest its been since 2011 even though the user base is huge now) and yet also so much that people could seriously talk about increasing the block size to something gigantic like 20MB. source
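Greg's roughly 300k transactions/day ceiling follows from simple arithmetic, assuming an average transaction size of about 500 bytes (my assumption; real averages vary):

```python
block_size_bytes = 1_000_000     # 1 MB consensus limit at the time
blocks_per_day = 24 * 60 // 10   # one block per ~10 minutes = 144
avg_tx_bytes = 500               # assumed average transaction size

txns_per_day = block_size_bytes // avg_tx_bytes * blocks_per_day
print(txns_per_day)  # 288000, i.e. on the order of 300k/day
```

A smaller assumed transaction size pushes the ceiling higher, which is why quoted figures for the 1 MB limit range from roughly 250k to 600k transactions per day.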
The problem is that the full node count is low for reasons other than data load, in my opinion. I think data load plays a part, but indirectly. I think users would like to help the network and run full nodes, but only if/when it becomes convenient enough; right now it's not. If I'm right, then we could see a similar node count with a 0.5 MB block size or a 5 MB block size. A recent survey suggests I may be right.
I posted a Full Node poll on Bitcointalk:
So far there are 23 votes and the leading reason for not running a full node is "It's inconvenient". Some comments are also illuminating:
"I used to run a full node, but I was running it on the laptop that I use for everyday things, so it became too inconvenient to keep it running. It made things sluggish. I'll boot it up every week or so in order to sync the blockchain to my computer, because I like being able to use Bitcoin Core for things, but I don't run a node anymore. So I selected 'it's inconvenient'."
"It just takes too long to sync; when I open my wallet I want to be able to spend ASAP. That, and it doesn't work with my Trezor as far as I am aware."
"I don't know how to configure it to help the network. I would like to highlight."
"I do run my PC 24/7 most of the time, with or without running a node, so electricity isn't really a problem for me, to be honest. But lately I have experienced a lot of issues with Bitcoin Core. One of them is 'database corrupted' each time I reboot the PC (even if I close it before restarting), and it takes too much time ... I mean, imagine: each time I run it, it reports the database as corrupted and has to reindex the whole file again, which takes up to two days. I'd have to do this over and over, and when I asked around the forums people said it could be a hardware or RAM problem, so I gave up on it and switched to a lightweight wallet."
"No incentive. Electricity cost."
Notice that for all of these answers, having a 0.5 MB or 5 MB block size likely wouldn't have changed anything. Now imagine this: you open the package your mailman left and find, to your delight, your plug-and-play Bitcoin Node. You plug it in, the light turns green, and you're set; it configures itself. From your computer's web browser you can monitor and create transactions, which you shuttle by USB stick to your Node to sign, à la Armory's security model.
Now you run a full node, which helps the network, and you have an incentive to do so (better security). The dedicated device doesn't get in the way of your normal computing activity, and it's always synced and ready to go, because that's all it does. It provides the convenience and immediacy of a web-based wallet.
I think we can make improvement to full node count by addressing usability obstacles to running them, regardless of block size.
If you haven't taken part in that poll please take a moment to do so or comment below. Thanks!
submitted by acoindr to Bitcoin [link] [comments]

Initial Armory Sync taking over 5 days now?? I have BTC stuck in Purgatory. Advice??

Hi everyone! I'm reaching out to reddit on this one, as I am a noob at bitcoin, the blockchain, and apparently choosing the appropriate wallets for my little and insignificant BTC transactions.
To start, this is my first transaction ever with BTC. I purchased funds from Coinbase and sent them over to an Electrum wallet I had set up through the clearnet. I started reading forums and apparently followed some bad advice about setting up a secondary wallet through Tor and transferring my coin to it as a means to further anonymize my bitcoin. It was then also recommended that I use Armory, as it has a reputation for being secure.
So this I did. I set up a secondary Armory wallet over Tor (which I later found out isn't very secure at all, depending on who controls those exit nodes) and then foolishly initiated a transfer before even beginning the ledger download, a ridiculously long initialization process I soon found out should have been completed beforehand. I'm obviously now eating crow for not being better versed and prepared before jumping in, nose first.
OK, so now my questions are these: my OS (with a good CPU, 4 GB RAM, excellent bandwidth) has been in the verification process for going on 5 days now and is currently at 83% of the 'Initializing Bitcoin Engine' phase. I've only been creeping along at maybe a 3% gain per day. Is this because I set up the wallet through Tor? Is there any fix I can use to speed this up? Also, will my coin even still be available and show up in my wallet if and when I finally do get back online?
Obviously I'm feeling most idiotic at the moment for this and I really appreciate ANY advice I can be given on the subject. I'm now aware that I should have never gone with such a heavy wallet for my small-time bitcoin to begin with. But I just don't know where to go from here. Thanks for reading guys...
submitted by Thundergun22 to BitcoinBeginners [link] [comments]

Why does Armory suck so bad...?

So about a week ago, I bought 1 BTC on Coinbase, and then I transferred 0.01 BTC to a wallet I created using Armory as a test. Problem is, Armory has been "scanning transaction history" for like 20 hours, not to mention using all my memory and disk during that time. It seems to be stuck. So, if anyone can enlighten me: can I empty the wallet I created back to my Coinbase account without waiting for that god-awful Armory software to finish scanning? (Or maybe I just have a real shitty computer? 4 GB RAM, 3.2 GHz quad core.)
And also, if Bitcoin-Qt stores every transaction since the beginning of time, won't it eventually take up hundreds of gigs, if not more?
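On that storage worry, the worst case is easy to bound under the 1 MB block limit of the time:

```python
block_limit_bytes = 1_000_000   # 1 MB per block
blocks_per_year = 144 * 365     # ~144 blocks per day

max_growth = block_limit_bytes * blocks_per_year
print(max_growth / 1e9)  # ~52.6 GB per year at permanently full blocks
```

So "hundreds of gigs" would take several years even if every block were full, though the limit itself was of course the subject of the block size debate.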
Sorry, I'm a bit of a newbie, and Bitcoin does not seem very newbie-friendly / non-tech-savvy-friendly =[
submitted by qqmore14 to Bitcoin [link] [comments]

Asking questions

So I've decided to join the illustrious pursuit of Bitcoin mining. I've done a bit of research on the matter, and read over a few of the posts here. And I still have a lot of research to do yet. My first problem is that there's still a lot that I'm not sure about and I'm still new enough that I don't quite know the right questions to ask, or if the questions I'm trying to ask are even applicable. So, I wanted to outline my situation and some of my thoughts here and see if this great community could help direct my research.
To start off, I purchased my ASIC hardware today, and I'm awaiting delivery. I managed to get a total of 13.2 GH/s for less than $300, which seemed like a good buy; it's more MH/s per dollar than the $350 USB device that Butterfly Labs was offering. I know that's pretty weak, but it's where I've decided to start. It's good enough for getting my feet wet.
My first real question is regarding the network bandwidth requirements of an active mining setup. All of the online guides I have read say absolutely nothing about this issue. Common sense tells me this is either a negligible concern or so blindingly obvious that it doesn't bear mentioning. So, at the risk of sounding dense, can anybody describe their situation in this area? I've got a 12 Mb/s DSL connection; will that be a bottleneck, or will it suffice?
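To put rough numbers on the bandwidth question (my back-of-envelope; the difficulty-1 shares and the couple-hundred-byte share size are both assumptions): pooled mining traffic is tiny, so a 12 Mb/s line is nowhere near a bottleneck.

```python
hashrate = 13.2e9        # 13.2 GH/s, the hardware mentioned above
share_difficulty = 1     # assumed; pools often assign higher difficulty
bytes_per_share = 200    # rough size of one share submission (assumption)

# A difficulty-1 share is found once per 2**32 hashes on average.
shares_per_sec = hashrate / (2**32 * share_difficulty)
bandwidth_bps = shares_per_sec * bytes_per_share * 8

print(round(shares_per_sec, 2))  # ~3.07 shares per second
print(round(bandwidth_bps))      # a few thousand bits per second
```

A few kilobits per second against a 12 Mb/s link is negligible; the pool only sends compact work templates and receives the occasional share, never the full blockchain.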
As I've come to understand, my next steps are to setup a wallet for the currency, join a user pool and setup a mining client. And it's these steps where my questions begin.
The most popular choice among the webz for wallets was easily Bitcoin Armory, but the 6 GB RAM requirement is a nonstarter for my current hardware setup. I understand that they are working to reduce that requirement, and also that hardware wallet modules are hitting the market soon. Until one of those two options becomes available, I'm looking at using a third-party, web-based wallet; any suggestions in that regard would be appreciated. But my first concern with a wallet is that I am very interested in mining cryptocurrencies other than Bitcoin, and I haven't been able to find out whether I will need an individual wallet for each unique currency I mine or whether wallets generally support more than one currency type.
Reading through a number of the beginner's guides found through Google gave me the impression that installing a mining client and joining a mining pool were separate issues, but a thread I read here seemed to indicate that particular clients are tied to specific mining pools. Ideally, I'd like a client that allows me to select a mining pool, as well as supporting mining for Bitcoin and multiple other cryptocurrencies. Is that a realistic expectation?
If the tone of any of my questions suggest any broad misconceptions, I would appreciate a heads up. If anybody can link to some helpful literature on the subject, that would also be appreciated.
Thanks for reading, and cheers for any help.
submitted by playedspades to BitcoinBeginners [link] [comments]

Argh. Have been using offline computer to sign transactions with a downloaded web utility. Firefox remembers my private keys...

Just a warning to those who have downloaded web utilities and have been running them on an offline computer. I was hoping that nothing ever gets stored on the computer, so that once I turn it off, there's no risk if the computer is stolen. It turns out Firefox was "storing sessions" or something like that.
Ugh. I should have known better. Haven't lost anything, but it sucks to know that your security wasn't nearly as good as you thought it was.
I really like all of the web-based utilities people have written in the Bitcoin community and would like to keep using them on an offline computer... if anyone has bulletproof suggestions as to how to prevent the computer from storing anything (i.e., only keep things in RAM), please share! (Maybe there is a stripped-down browser I should use...?)
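One RAM-only approach on Linux, sketched below as an untested suggestion rather than a vetted recipe, is to point Firefox at a profile that lives on a tmpfs mount, which exists only in memory and vanishes at power-off. Note that swap can still spill RAM to disk, so disable it first.

```shell
# Prevent RAM contents from being paged out to disk
sudo swapoff -a

# Create a RAM-backed mount point for a throwaway browser profile
sudo mkdir -p /mnt/ramprofile
sudo mount -t tmpfs -o size=256m tmpfs /mnt/ramprofile

# Run Firefox against that profile; session data stays in RAM
firefox -profile /mnt/ramprofile -private-window

# After closing the browser, discard the profile entirely
sudo umount /mnt/ramprofile
```

A live OS such as TAILS achieves the same thing more thoroughly (the entire filesystem is RAM-backed), which is why it is the usual recommendation for offline signing.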
I know that using Armory would solve this and that many other offline transaction signing solutions exist.
submitted by chriswilmer to Bitcoin [link] [comments]

2FA v alternative wallets

My first post (I'm based in Europe), but I've been reading reddit for a year plus and never had much of a reason to sign up and post something until now, as I've put my money where my mouth is and bought some bitcoins. If I'd gotten in at 3, 9, or 27+ I'd still be holding most of them, as I'm in it for the long haul and wouldn't sell even if it hits 5000 (I bought in at <200), as I can't help but be excited about where this could end up.
I'm a programmer/developer and have known about Bitcoin since March of this year, but I was turned off it completely when I read somewhere (probably on reddit) that a third of all bitcoins have been stolen. That person owes me!
I'd been looking for a company that went up as much as I/we all hope bitcoin does, in terms of investment, and both Cisco and Microsoft seem to be prime examples of what we hope can be achieved. I'll be spending my coins where possible, but I feel this might be a few months away yet, especially where I live.
In terms of my post, I'm all for recommending a wallet that holds bitcoins with two-factor authentication (2FA) enabled, so that I need my identifier, my strong password, and a unique PIN (sent to my mobile each time) to log in. On top of that, a second password is required to withdraw or spend coin, so I'm pretty satisfied it's secure.
As an added measure, I have enabled 2FA on my Gmail account. I have a small amount in my Bitstamp account, which also requires an email verification in order to move my bitcoins.
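As an aside, the rotating codes behind authenticator-app 2FA (SMS-delivered PINs like the poster's work differently) are just RFC 6238 TOTP, implementable in a dozen lines of stdlib Python:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, timestep=30, digits=6, now=None):
    """RFC 6238 TOTP with HMAC-SHA1 (the common authenticator default)."""
    counter = int((time.time() if now is None else now) // timestep)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): offset comes from the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The shared secret never travels over the network after enrollment, which is why app-based TOTP is generally considered stronger than SMS delivery.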
If I am able to move my blockchain wallet to a PC if anything were ever to happen to the domain/website, then why would I risk having multiple encrypted copies on a PC, paper wallet, or whatever other alternatives exist? If I did switch, I'd probably end up using Armory, but I can't seem to get it going as it takes forever, and I've read the older version takes 8 GB of RAM to run smoothly on Windows, though the watch-only wallet was very appealing. I may end up moving everything to Armory on Linux with no network connectivity, depending on users' feedback here (if any), but I'm quite content at the moment to keep everything as is.
submitted by fancyabit to Bitcoin [link] [comments]

Bitcoin Armory - Simulfunding a Lockbox
Armory Tutorial Part 1 - YouTube
Armory Companion Demo
How To Build The Cheapest Mining Rig Possible! - YouTube
Bitcoin Armory Troubleshooting Offline Node - YouTube

Bitcoin Armory has arguably the most comprehensive set of wallet functionalities on the market. The online wallet is HD and has a flexible multiple-signature feature, putting Armory among the top online wallets out there. But Armory is perhaps best known for its cold storage solution. Great online solution, great offline solution, one wallet. Indeed, Bitcoin Armory provides enterprise ...
Armory directly reads from the block data files that Bitcoin Core and its forks produce. It also relies on the p2p network messages and the JSON-RPC server in order to communicate with Bitcoin Core. If the full node software does not use the same block data file format used by Bitcoin Core, or does not support the same JSON-RPC functions, then it will not be compatible with Armory.
Armory Wallet is in fact a complement to Bitcoin Core that makes it possible to raise the level of security when handling virtual funds. The main prerequisite for installing the Armory wallet is having a second workstation with 512 megabytes of RAM and a USB drive for storing information.
Open Bitcoin Armory (testnet), navigate to the settings panel, and deselect "Let Armory run Bitcoin-Qt/bitcoind in the background." Next, navigate to the location of your Bitcoin Core installation, for example C:\Program Files\Bitcoin. Once there, it is easiest to click on the "bitcoin-qt" application and create a shortcut. Next, you ...
Armory is used by some of the most heavily-invested, and most paranoid, Bitcoin enthusiasts and cloud miners for maximum privacy and security. If you are in this category, it is recommended you verify that your Armory installers have not been altered in any way. Armory Ubuntu/Debian packages (*.deb files) are signed directly using our [Offline ...


Bitcoin Armory - Simulfunding a Lockbox

I made an app and some Python scripts that allow one to use Armory offline without moving transaction content back and forth with USB keys. This makes it so you can have an offline wallet in ...
How to install Armory 0.93 and Bitcoin Core 0.10 on Windows 7, and how to create your first wallet. A guide for beginners. Enable subtitles in the video. If you like the guide, feel free to send a ...
Bitcoin Armory - Spending from a Lockbox - Duration: 4:56. Andy Ofiesh, 149 views.
How to create and use Multi Sig Bitcoin Wallets - Duration: 10:23. Hodl Hodl, 6,541 views.
Most people view building a mining rig as an expensive or confusing thing to do. However, we break down what exactly you need for your mining rig & how to do...
Bitcoin Armory - Simulfunding a Lockbox