Tuesday 29 March 2022

Vue.js not building correctly in 'dist' folder 'npm run build'



I'd been hitting a bit of a brick wall with this task, so I thought it'd be a good idea to commit it to the blog, in case someone else hits it and can save themselves a heap of time. The task I was trying to achieve was:

'How to host a Vue.js project on Firebase'

And I was following this blog post and YouTube video for guidance:

How to deploy vue.js applications with firebase hosting

And Net Ninja's 

Firebase Hosting Tutorial #4 - Deploying a Vue Site

But on a site (that was already built and working locally) I could only see the Firebase hosting holding message:

'Firebase Hosting Setup Complete' 



I could, however, create a new Vue.js website (boilerplate) and work from that, but building it up from scratch again would be excruciating.


Let's take a look at the process I was following and where it was getting stuck.

$ firebase init

Scroll down to the first 'Hosting' config choice, press the spacebar to select it and then press return.


There is already a 'public' folder in use, so we'll want to set the public directory to 'dist'.

As we're using Vue.js, which is a JavaScript app that serves all pages through the index page, we need to answer 'y' to 'rewrite all urls to /index.html'. This creates a dummy index page.
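For reference, answering 'y' here produces a firebase.json along these lines (a sketch of the generated file; the ignore list in yours may vary):

```json
{
  "hosting": {
    "public": "dist",
    "ignore": ["firebase.json", "**/.*", "**/node_modules/**"],
    "rewrites": [
      {
        "source": "**",
        "destination": "/index.html"
      }
    ]
  }
}
```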


We don’t want to deploy this , we now want to build our vue application so that all the files go inside this ‘dist’ folder and then we want to deploy it.


$ npm run build


And we should now be able to view our site with:

$ firebase serve

 

And this gives us the same 'Firebase Hosting' message we were seeing before :confused:



On my basic Vue.js install, though, I could see that it was working fine. One thing that caught my eye was that the 'dist' folder was still empty. Hold on! No, the issue was that another 'dist' folder had been created: I had one in the root of the project and one in the 'app' folder.

  1. I removed both the ‘dist’ folders and started again.

  2. Ran 'firebase init' and chose the hosting options as before.

  3. The 'dist' folder was created in the root, but I manually moved it to the 'app' folder.

  4. I then ran 'npm run build'. This created the new files.

  5. In ‘firebase.json’ I had to add the following code

    "hosting": {
      "public": "app/dist",

  6. Ran ‘firebase serve’.


    So I have the site showing using ‘firebase serve’ . :slight_smile:


  7. Ran 'firebase deploy --only hosting'.

And I can now see the app online :) I hope this saves someone a heap of time, as it took me a while to work this one out :/













CodeRed CMS - How to Order (weight) the Navigation snippets

The task was simple: I'd added a load of menu items to the top menu using snippets, but I wanted to change their order.

What, no drag and drop! And neither could I see an input box to enter a position. Surely I wouldn't have to get my hands dirty and write some code for this.

Here's an easy fix though.

As the navigation uses flexbox, all you need to do is use a Bootstrap order class.

Then, in your 'navigation' item 'snippet', just add the ordering like this:
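A sketch of how this might look, assuming Bootstrap 4+ (whose nav is flexbox-based) and a snippet that exposes a custom CSS class field; the menu items here are hypothetical:

```html
<!-- Bootstrap's order-* flex utilities control the rendered order,
     independent of source order: order-1 renders before order-2, etc.
     Add the class (e.g. 'order-1') to each navigation snippet item. -->
<ul class="navbar-nav">
  <li class="nav-item order-2"><a class="nav-link" href="/about">About</a></li>
  <li class="nav-item order-1"><a class="nav-link" href="/">Home</a></li>
  <li class="nav-item order-3"><a class="nav-link" href="/contact">Contact</a></li>
</ul>
```

With this markup, 'Home' renders first even though 'About' comes first in the source.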










Wednesday 16 March 2022

Gsutil - How to empty a GCS bucket

Although I couldn't find a 'gsutil' command to empty a bucket as such, the following will remove and then recreate the bucket.

I've also added the code here for how to do this from an Airflow DAG.


% gsutil -m rm -r gs://djem_photos

% gsutil mb gs://djem_photos


from airflow import models
from airflow.operators import bash_operator

bucket = 'gs://temp_gp_tasks'

# default_dag_args is defined elsewhere in the DAG file
with models.DAG(
        'remove_files_from_gcs',
        schedule_interval=None,
        default_args=default_dag_args,
        catchup=True) as dag:

    # Remove the bucket and everything in it
    remove_gcs_bucket = bash_operator.BashOperator(
        task_id='remove_gcs_bucket',
        bash_command='gsutil -m rm -r {bucket}'.format(bucket=bucket))

    # Recreate the (now empty) bucket
    recreate_gcs_bucket = bash_operator.BashOperator(
        task_id='recreate_gcs_bucket',
        bash_command='gsutil mb {bucket}'.format(bucket=bucket))

    remove_gcs_bucket >> recreate_gcs_bucket

Monday 7 March 2022

Airflow Connection Code Example - Access connection data from code

Here's a straightforward piece of code, but I found this solution difficult to uncover. So here's how to get the data from an Airflow connection into my DAG, in Google Cloud Platform:


import logging

from airflow.hooks.base_hook import BaseHook

def connect_to_somewhere():
    # Fetch the connection inside the task, so it isn't looked up
    # every time the DAG file is parsed
    conn = BaseHook.get_connection('my_conn')
    logging.info(conn.login)
    logging.info(conn.password)
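Airflow connections can also be supplied as URIs via AIRFLOW_CONN_&lt;CONN_ID&gt; environment variables. As a quick stdlib-only sketch (the URI below is hypothetical), the same login/password fields can be picked out of such a URI with urllib.parse:

```python
from urllib.parse import urlparse

# Hypothetical connection URI, in the shape Airflow accepts via
# AIRFLOW_CONN_<CONN_ID> environment variables:
#   scheme://login:password@host:port/schema
uri = "postgresql://my_user:my_pass@db.example.com:5432/analytics"

parsed = urlparse(uri)
login = parsed.username      # maps to conn.login
password = parsed.password   # maps to conn.password
host = parsed.hostname       # maps to conn.host
```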



Hope that helps someone out there ! 

Thursday 3 March 2022

GCP DAGs - "Given * file, /home/airflow/dags/*, could not be opened"

Although this post is about solving my specific issue with locating the 'googleads.yaml' and its 'private_key' file, it will also be useful to others who want to access any other file that they have uploaded.

I had the following code that I could not get to work.  


FOLDER_PATH = '~/dags/'

def gam_traffic():
    client = ad_manager.AdManagerClient.LoadFromStorage(FOLDER_PATH + 'googleads.yaml')

I'd tried other folder names, like the gs:// bucket address and the full address like europe-west1-composer-XXX-XXXX-bucket. All to no avail.


RED HERRING ALERT

Using the gs:// bucket address worked fine for saving and retrieving .csv files using Python pandas, but I couldn't get the address to work with the Python open() function. My assumption now is that the pandas library must contain some 'magic' that processes the address when it sees that the URI starts with 'gs://'.
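A quick way to see the red herring for yourself (the bucket name below is hypothetical): the built-in open() has no idea what a 'gs://' URI is and just treats it as a non-existent local path, whereas pandas hands gs:// URIs off to a filesystem layer (gcsfs/fsspec) that knows how to talk to Cloud Storage.

```python
# open() treats 'gs://...' as a literal local path, so it fails with
# FileNotFoundError rather than fetching anything from Cloud Storage.
try:
    open("gs://some-bucket/googleads.yaml")
    opened = True
except OSError:  # raised as FileNotFoundError: no such local path
    opened = False
```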

To solve my issue and find the path I needed, I ran the following tasks to print out the folder we're in, and all the files and folders inside it.


import logging
import os

# Print the current working directory
def print_working_dir():
    directory = os.getcwd()
    logging.info(directory)
    return "success"

 

import glob
import logging

# List every file and folder under the working directory
def list_working_dir():
    for filename in glob.iglob("./**/*", recursive=True):
        logging.info(filename)



The second task takes ages to run, as it's iterating through loads of folders. In our case we checked the logs while it was still running, as we could already see the information that we needed.

In our case the filepath we needed was:

./gcsfuse/dags/googleads.yaml