Have you ever drifted into a situation where you couldn't get your repo up to date and kept getting error after error? Well, this happens to me a few times a year, and it's usually when I forget to create a .gitignore in my Terraform project folder. The issue most commonly points to files over 100MB that GitHub refuses to accept. When I first came across this issue, I said, "OK, I will just go delete the large files and re-push…" Unfortunately, it was not that easy. Git was still yelling at me that some files were too large! But wait, I removed them??? Yes, I did — but the commit containing those files had already been made, so even though the working tree shows them as deleted, the large files still live inside the commit history that Git tries to push. How do we fix this? It wasn't easy the first time around, but after digging and digging I was able to figure it out.

The issue was that my local repo had two commits that were never pushed to the remote repo, so the two were out of sync. The key is that you probably do not want to lose your local changes, so you must be careful.

Let's take a look at the original output from the failed git push:

remote: error: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
remote: error: Trace: d8da408a23f1c1cafca7b55264cf0823eb3ab1259b0a80202549a51ea2dc0718
remote: error: See http://git.io/iEPt8g for more information.
remote: error: File AWS-Lab/Project_01_vpc-security/terraform/modules/.terraform/providers/registry.terraform.io/hashicorp/aws/3.24.1/darwin_amd64/terraform-provider-aws_v3.24.1_x5 is 194.74 MB; this exceeds GitHub's file size limit of 100.00 MB
To https://github.com/***********/Terraform-Master-Directory.git
 ! [remote rejected] master -> master (pre-receive hook declined)
error: failed to push some refs to 'https://github.com/***********/Terraform-Master-Directory.git'

When you run "terraform init", Terraform downloads any modules or plugins it needs to work within that directory. As you can see, the Terraform AWS provider plugin is well beyond 100MB and will not push to GitHub due to the size restriction. This is where the .gitignore comes in handy, letting you ignore these types of files entirely.
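Before pushing, you can hunt down anything that would trip GitHub's limit with a plain `find` using the same 100MB threshold. This is a minimal sketch in a throwaway directory — the provider path is hypothetical, and a sparse file stands in for a real provider binary:

```shell
# Demo: flag files over GitHub's 100MB limit before pushing.
# A sparse file stands in for a real provider binary (hypothetical path).
set -e
demo=$(mktemp -d)
mkdir -p "$demo/.terraform/providers"
# Sparse ~195MB file -- occupies almost no real disk space
truncate -s 195M "$demo/.terraform/providers/terraform-provider-aws_v3.24.1_x5"
# List anything over 100MB, the same threshold GitHub enforces
big=$(find "$demo" -type f -size +100M)
echo "$big"
rm -rf "$demo"
```

Running this against your real project root (instead of the demo directory) shows you exactly which files the pre-receive hook would reject.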

The file usually looks something like this:

# Local .terraform directories
**/.terraform/*

# .tfstate files
*.tfstate
*.tfstate.*

# Crash log files
crash.log

# Ignore any .tfvars files that are generated automatically for each Terraform run. Most
# .tfvars files are managed as part of configuration and so should be included in
# version control.
#
# example.tfvars

# Ignore override files as they are usually used to override resources locally and so
# are not checked in
override.tf
override.tf.json
*_override.tf
*_override.tf.json

# Include override files you do wish to add to version control using negated pattern
# !example_override.tf

# Include tfplan files to ignore the plan output of command: terraform plan -out=tfplan
# example: *tfplan*

# Editor and OS files
.history
**.DS_Store

# Personal notes
my-todo.md

Within this file you define what you would like Git to ignore when performing a git commit and git push. All downloaded modules and plugins live in the hidden .terraform folder. As you can see, the .gitignore is set to ignore **/.terraform/*, which matches a .terraform folder — and everything inside it — at any directory depth. But because I committed before I deleted the large file and added the .gitignore, the large file is still baked into my unpushed commits, and every push will keep tripping over it. Let's fix this!
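One gotcha worth knowing: a .gitignore only affects files Git isn't already tracking. If .terraform was committed at some point, you also have to untrack it. Here's a minimal sketch in a throwaway repo (demo file names and commit messages are made up) showing `git rm -r --cached` doing exactly that — the files stay on disk but leave the index:

```shell
# Demo: .gitignore does not untrack files that were already committed;
# `git rm -r --cached` removes them from the index going forward.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com   # throwaway identity for the demo
git config user.name demo
mkdir -p .terraform/providers
echo fake-provider > .terraform/providers/plugin
git add -A && git commit -qm "oops: committed .terraform"
# Adding the ignore rule alone is not enough for already-tracked files
echo '**/.terraform/*' > .gitignore
# Untrack the directory (files stay on disk) and commit the removal
git rm -r -q --cached .terraform
git add .gitignore && git commit -qm "stop tracking .terraform"
tracked=$(git ls-files)
echo "$tracked"
```

After this, only .gitignore is tracked, while the provider file still sits untouched in the working directory.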

First things first, let's see what the status of Git is:

swilliams@Stevens-MacBook-Pro Terraform-Master-Directory % git status
On branch master
Your branch is ahead of 'origin/master' by 2 commits.
(use "git push" to publish your local commits)

Per this git status, my local repo is two commits ahead of my remote repo. One of those commits still contains the large file that I deleted only after committing it.

Let's reset this count and get it back to zero so that we (1) don't lose our local changes to the code and (2) can push the code to GitHub after taking care of the pesky large files.

We achieve this by running git reset HEAD~(number of commits you want to remove). In our case it's two commits, taking us back to before the large file was ever committed. Note that this is a mixed reset (the default mode), so the commits are undone but all of their changes stay in your working tree as unstaged changes — nothing is lost.

swilliams@Stevens-MacBook-Pro Terraform-Master-Directory % git reset HEAD~2
Unstaged changes after reset:
M .DS_Store
M AWS-Lab/.DS_Store
D AWS-Lab/Project_01_vpc-security/main.tf
M AWS-Lab/Project_01_vpc-security/variable.tf
M AWS-Lab/Project_03_EC2_VPC_Learning/.DS_Store
M Hashicorp-Certified-Files/.DS_Store
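You can reproduce this behavior safely in a throwaway repo before trying it on real work. This sketch (demo file names are invented) makes three commits, rewinds two, and shows that every change from the undone commits is still sitting in the working tree:

```shell
# Demo: `git reset HEAD~2` undoes the last two commits but keeps
# the changes from those commits in the working tree, unstaged.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com   # throwaway identity for the demo
git config user.name demo
echo base > main.tf && git add -A && git commit -qm "c1"
echo more >> main.tf && git add -A && git commit -qm "c2"
echo oops > huge-provider-stand-in && git add -A && git commit -qm "c3"
# Rewind two commits; the default mode is --mixed, so files are untouched
git reset -q HEAD~2
commits=$(git rev-list --count HEAD)
echo "commits now: $commits"
```

Only one commit remains, yet both the edit to main.tf and the stand-in "large file" are still on disk — which is exactly why this approach doesn't lose local changes.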

Now that we have done that, let's take a look at what the output of "git status" is now.


swilliams@Stevens-MacBook-Pro Terraform-Master-Directory % git status
On branch master
Your branch is up to date with 'origin/master'.

We are essentially back at zero. Now my local and remote repos are in sync and no issues are detected. We can commit again (after deleting the large file or placing a .gitignore file within the directory) and push to GitHub with no issues or challenges. I believe there are other ways to get your code in sync with your remote repo, but this method has worked the best for me and really is quite easy.
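If you ever want the exact "ahead by N commits" number without parsing git status, `git rev-list --count` gives it directly. A sketch using a local bare repo as a stand-in remote (repo names and files here are invented for the demo):

```shell
# Demo: count unpushed commits -- the same N that `git status` reports
# as "ahead of 'origin/master' by N commits". A local bare repo stands
# in for GitHub.
set -e
work=$(mktemp -d)
remote=$(mktemp -d)
git init -q --bare "$remote"
cd "$work"
git init -q -b master
git config user.email demo@example.com   # throwaway identity for the demo
git config user.name demo
git remote add origin "$remote"
echo one > a.txt && git add -A && git commit -qm "c1"
git push -q -u origin master
# Two local commits that are never pushed
echo two >> a.txt && git add -A && git commit -qm "c2"
echo three >> a.txt && git add -A && git commit -qm "c3"
ahead=$(git rev-list --count origin/master..HEAD)
echo "ahead by: $ahead"
```

When that count reaches zero (or you've reset past the bad commit), you know the push will no longer carry the offending history.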
