Upload File to Google Drive using Python

This blog article shows you how to upload a file to Google Drive using Python.

To upload a file to Google Drive using Python, you’ll need to use the Google Drive API. Here’s a step-by-step guide on how to achieve this:

1. Set up a Google Cloud Project and enable the Google Drive API:

– Go to the [Google Cloud Console](https://console.cloud.google.com/).

– Create a new project (or select an existing one).

– Enable the Google Drive API for your project.

– Create OAuth 2.0 credentials for a Desktop application.

– Download the `credentials.json` file.

2. Install required libraries:

You will need the `google-auth`, `google-auth-oauthlib`, `google-auth-httplib2`, and `google-api-python-client` libraries. You can install them using pip:

pip install google-auth google-auth-oauthlib google-auth-httplib2 google-api-python-client

3. Write the Python script to upload the file:

Below is a sample Python script that handles the authentication process and uploads a file to Google Drive.

from __future__ import print_function
import os
import pickle
from google.auth.transport.requests import Request
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

# If modifying these SCOPES, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/drive.file']

def main():
    """Shows basic usage of the Drive v3 API.
    Uploads a file to Google Drive.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    service = build('drive', 'v3', credentials=creds)
    # File you want to upload
    file_path = 'your_file.txt'  # Update this with the path to your file
    file_metadata = {'name': os.path.basename(file_path)}
    media = MediaFileUpload(file_path, mimetype='text/plain')
    # Upload the file
    file = service.files().create(body=file_metadata,
                                  media_body=media,
                                  fields='id').execute()
    print('File ID: %s' % file.get('id'))

if __name__ == '__main__':
    main()


Explanation:

Also: Where to get the ClientId and ClientSecret for Google API

1. Import necessary libraries: The script imports the required libraries for handling authentication and making requests to the Google Drive API.

2. Define SCOPES: Scopes determine the level of access you need. `'https://www.googleapis.com/auth/drive.file'` allows read and write access to the files created or opened by the app.

3. Handle credentials: The script checks if there is an existing token (`token.pickle`). If not, it goes through the OAuth 2.0 flow to generate one.

4. Build the Drive API service: Using the credentials, it creates a service object for the Drive API.

5. File metadata and upload: The script sets up the file metadata (like the file name) and uploads the file using `MediaFileUpload`.
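If you want more control over the metadata — for example uploading into a specific Drive folder — you can add a `parents` list to the metadata body and guess the MIME type from the file name instead of hard-coding `text/plain`. The sketch below is illustrative: `build_upload_metadata` is a hypothetical helper, and `FOLDER_ID` is a placeholder you would normally copy from the folder's URL in Drive.

```python
import mimetypes
import os

def build_upload_metadata(file_path, folder_id=None):
    """Build the metadata body and MIME type for a Drive upload.

    folder_id is optional; when given, Drive places the file in that
    folder instead of the root of My Drive.
    """
    metadata = {'name': os.path.basename(file_path)}
    if folder_id:
        metadata['parents'] = [folder_id]
    mime_type, _ = mimetypes.guess_type(file_path)
    return metadata, mime_type or 'application/octet-stream'

metadata, mime_type = build_upload_metadata('report.pdf', folder_id='FOLDER_ID')
print(metadata)   # {'name': 'report.pdf', 'parents': ['FOLDER_ID']}
print(mime_type)  # application/pdf
```

You would then pass `metadata` as `body=` and `MediaFileUpload(file_path, mimetype=mime_type)` as `media_body=` to `service.files().create(...)` in the script above.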

Running the Script:

– Make sure your `credentials.json` file is in the same directory as your script.

– Replace `'your_file.txt'` with the path to the file you want to upload.

– Run the script using Python.

This will upload the specified file to your Google Drive and print the file ID upon successful upload.

Source code download: https://github.com/chanmmn/python/tree/main/2024/UploadFileGoogleDrive?WT.mc_id=DP-MVP-36769

Reference: https://developers.google.com/docs/api/quickstart/python?WT.mc_id=DP-MVP-36769

https://github.com/googleapis/google-api-python-client?WT.mc_id=DP-MVP-36769

Posted in Uncategorized | Leave a comment

Permissions for ‘key.pem’ are too open When accessing Azure Linux

This blog article shows you one possible way to eliminate the error message “Permissions for ‘key.pem’ are too open”. On an Azure VM installed with Ubuntu, you can get the .pem key for SSH login from a screen like the one below.


Once you get the key file with the .pem extension, you can issue the following command.

ssh -i key.pem azureuser@22.199.155.112

However, you may hit an error.


To get rid of the error, open the Properties of the file and go to the Security tab. Remove all the users except yourself.
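The same fix can be scripted instead of clicking through the Properties dialog. On Linux, WSL, or macOS the standard remedy is to tighten the file mode; the Windows `icacls` commands in the comments are a rough equivalent of the GUI steps (shown here as a sketch, adjust the path to where your key actually lives):

```shell
# key.pem is the private key downloaded from the Azure portal
KEY=key.pem
touch "$KEY"        # no-op if the key already exists
chmod 600 "$KEY"    # owner read/write only; ssh stops complaining
ls -l "$KEY"        # should now show -rw-------

# On Windows (cmd or PowerShell), roughly equivalent to the GUI steps:
#   icacls key.pem /inheritance:r
#   icacls key.pem /grant:r "%USERNAME%:R"
```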


If you rerun the SSH command, you can now access the Linux machine.


Reference: https://superuser.com/questions/1296024/windows-ssh-permissions-for-private-key-are-too-open?WT.mc_id=DP-MVP-36769

Posted in .Net, Cloud, Community, Computers and Internet, Operating System, Servers | Tagged | Leave a comment

Debug T-SQL using Visual Studio

This blog article shows you how to debug T-SQL statements using Visual Studio. In the blog article Querying All Tables in a Database Using Dynamic SQL in Microsoft SQL Server, we have the SQL code as follows.


Start Visual Studio, go to Tools then select SQL Server -> New Query…


Log in to SQL Server, then click Connect.


Paste the SQL code into Visual Studio.


Select Execute with Debugger.


You can step through the T-SQL code now.


Video: https://youtu.be/k2mRuFMC2Wc?WT.mc_id=DP-MVP-36769

Reference: https://learn.microsoft.com/en-us/sql/ssms/scripting/step-through-transact-sql-code?view=sql-server-ver16?WT.mc_id=DP-MVP-36769

Posted in .Net, Cloud, Community, Computers and Internet, Data Platform, programming | Tagged | Leave a comment

Add GitHub Copilot to Visual Studio

This blog article shows you how to add GitHub Copilot Completions and GitHub Copilot Chat to Visual Studio 2022. This article expects you to know how to create a Console App in Visual Studio 2022. The figure below shows Visual Studio before GitHub Copilot is added.


To add GitHub Copilot, click on Extensions, then select Manage Extensions…


In the Search box, type github copilot. Click Download for GitHub Copilot Completions.


You will then see the alert as follows:


Save your work and close Visual Studio; the installation of GitHub Copilot Completions will then begin.


Click Close when the installation is completed.

Also: Generate Code using GitHub Copilot, Gemini, ChatGPT, and Bing Chat


When you reopen your project, you need to click Add GitHub Account. Visual Studio will ask you to sign in; type in your username and password.


Click on Re-enter your credentials for GitHub.


Type in your username and password.


You will see the Success screen.


Now you can see that the auto completion is powered by GitHub Copilot.


Next, add GitHub Copilot Chat from Manage Extensions.


Re-open the project. Right-click on the code page and select Add Copilot.


You can now ask Copilot to write some code for your project.

Note that GitHub Copilot is a paid service.

Reference: https://learn.microsoft.com/en-us/visualstudio/ide/visual-studio-github-copilot-chat?view=vs-2022?WT.mc_id=DP-MVP-36769

Posted in .Net, Cloud, Community, Computers and Internet, programming | Tagged | Leave a comment

CTO May 2024 articles and resources

These are the good reads found this month.

Is Kubernetes worth it?

· What to consider

· The importance of a TCO analysis…


https://www.infoworld.com/article/3714768/is-kubernetes-worth-the-price-tag.html

How to set – and achieve – DEI goals in IT

· Take stock

· Get specific

· Measure progress and be prepared to adjust tactics…

https://www.computerworld.com/article/2095849/set-and-achieve-dei-goals-in-technology-workforce-diversity-equity-inclusion.html

Why companies are leaving the cloud

· Cost, not cloud

· Don’t feel sorry for the public cloud providers…

https://www.infoworld.com/article/3712861/why-companies-are-leaving-the-cloud.html

3 Signs It’s Time to Move off the Cloud

· Timing

· Cost

· The Cost of Not Moving…

https://thenewstack.io/three-signs-its-time-to-move-off-the-cloud/

The cloud is benefiting IT, but not business

· Who benefits from the cloud (besides vendors)?

· ROI of cloud computing and AI

https://www.infoworld.com/article/3715300/the-cloud-is-benefiting-it-but-not-business.html

8 Essential Qualities of Successful Leaders

1. Authenticity

2. Curiosity

3. Analytical prowess…

https://hbr.org/2023/12/8-essential-qualities-of-successful-leaders?ab=HP-hero-for-you-text-1

Resource:

Case Studies in Cloud Modernisation

https://chanmingman.wordpress.com/2023/02/27/case-studies-in-cloud-modernisation/?WT.mc_id=DP-MVP-36769

Posted in .Net, Cloud, Community, Computers and Internet, CTO, Data Platform | Tagged | Leave a comment

Querying All Tables in a Database Using Dynamic SQL in Microsoft SQL Server

Are you dealing with a database where you need to query every table, but you’re not keen on writing out individual SELECT statements for each one? Microsoft SQL Server offers a solution through dynamic SQL and cursors, enabling you to automate the process efficiently. Let’s delve into how you can achieve this.

Dynamic SQL

Dynamic SQL refers to generating SQL statements dynamically at runtime. It allows you to construct SQL statements based on variables or conditions, offering flexibility in query generation.

Cursors

Cursors in SQL Server are database objects used to retrieve data row by row, which is particularly useful when you need to iterate through a result set or perform operations on each row individually.

Combining Dynamic SQL and Cursors

We can harness the power of dynamic SQL and cursors to iterate through all tables in a database and execute a SELECT statement for each table.

Here’s how you can do it in Microsoft SQL Server:

DECLARE @TableName NVARCHAR(255)
DECLARE @SQL NVARCHAR(MAX)
-- Declare a cursor to loop through all table names
DECLARE TableCursor CURSOR FOR
SELECT name
FROM sys.tables
-- Open the cursor
OPEN TableCursor
-- Fetch the first table name from the cursor
FETCH NEXT FROM TableCursor INTO @TableName
-- Start looping through tables
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build the dynamic SQL statement
    SET @SQL = 'SELECT * FROM ' + QUOTENAME(@TableName)
    -- Execute the dynamic SQL
    EXEC sp_executesql @SQL
    -- Fetch the next table name
    FETCH NEXT FROM TableCursor INTO @TableName
END
-- Close and deallocate the cursor
CLOSE TableCursor
DEALLOCATE TableCursor

Also: Find the Common Number for the Numerator and the Denominator T-SQL


Explanation

– We declare a cursor (`TableCursor`) to loop through all table names retrieved from the `sys.tables` system catalog view.

– Inside the loop, we construct a dynamic SQL statement to select all columns from the current table.

– The dynamic SQL statement is executed using `sp_executesql`.

– The cursor continues fetching the next table name until all tables have been processed.
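One caveat: `name` alone from `sys.tables` ignores schemas, so tables outside `dbo` can fail to resolve. A hedged variant of the same cursor pattern that schema-qualifies each table — and caps the rows returned per table, where `TOP (10)` is just an illustrative choice — might look like this:

```sql
-- Schema-qualified variant: build two-part names and limit output per table
DECLARE @SchemaName NVARCHAR(255)
DECLARE @TableName NVARCHAR(255)
DECLARE @SQL NVARCHAR(MAX)

DECLARE TableCursor CURSOR FOR
SELECT s.name, t.name
FROM sys.tables t
JOIN sys.schemas s ON t.schema_id = s.schema_id

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @SchemaName, @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @SQL = 'SELECT TOP (10) * FROM '
             + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@TableName)
    EXEC sp_executesql @SQL
    FETCH NEXT FROM TableCursor INTO @SchemaName, @TableName
END
CLOSE TableCursor
DEALLOCATE TableCursor
```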

By leveraging dynamic SQL and cursors in Microsoft SQL Server, you can streamline the process of querying multiple tables within a database. This approach not only saves time and effort but also enhances the automation capabilities of your database queries. So the next time you find yourself needing to query every table in a SQL Server database, remember that this efficient method is at your disposal.

Source code download: https://github.com/chanmmn/database/tree/main/2024/script/LoopAllDatabaseTables?WT.mc_id=DP-MVP-36769

Reference: https://learn.microsoft.com/en-us/sql/t-sql/language-elements/declare-cursor-transact-sql?view=sql-server-ver16?WT.mc_id=DP-MVP-36769

Posted in .Net, Cloud, Community, Computers and Internet, Data Platform, programming | Tagged | 1 Comment

Azure Storage download files using PowerShell

This blog article shows you how to download files from an Azure Storage container using PowerShell. Below is the code snippet to download the files.

Connect-AzAccount -Tenant TenantId -Subscription SubscriptionId
$resourceGroupName = "resourcename"
$storageAccName = "accname"
$container = "document"
$destination = "C:\test\test"
Function Getfiles
{
    $ctx = (Get-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccName).Context
    # Only pick up blobs last modified on 2023-12-04
    $files = Get-AzStorageBlob -Container $container -Context $ctx | Where-Object {$_.LastModified -gt "2023-12-04" -and $_.LastModified -lt "2023-12-05"}
    foreach ($file in $files)
    {
        Write-Host -ForegroundColor Yellow $file.Name
        Get-AzStorageBlobContent -Container $container -Context $ctx -Blob $file.Name -Destination $destination
    }
}
Getfiles

Also: Azure Storage Filter Last Modified Date


Source code download: https://github.com/chanmmn/powershell/tree/main/2024/AzureStorageGetFile/?WT.mc_id=DP-MVP-36769

Reference: https://sqlsrv4living.blogspot.com/2020/02/azure-blob-storage-list-and-download.html?WT.mc_id=DP-MVP-36769

Posted in .Net, Cloud, Community, Computers and Internet, programming | Tagged | Leave a comment

Azure Migrate and Modernize

Create highly differentiated digital experiences and deliver continuous innovation with Azure Innovate, a new offering to help you plan and deploy Microsoft solutions across apps, data and AI to build intelligent applications.

Get started with the Azure Migrate and Modernize + Azure Innovate Resource Kit to explore cloud migration and modernisation scenarios and find resources to help you build new apps in the cloud.

Use the resource kit to:

  • Get expert guidance to help define your business plan and create a business case for developing AI apps.
  • Explore resources and help to get started with solutions like Azure Kubernetes Service (AKS), Azure Cosmos DB and Azure AI services.
  • Learn about the cost benefits of developing intelligent apps, and how to drive business performance and growth.

This resource kit includes comprehensive advice and resources to help you adopt Azure across all stages of your cloud innovation journey.


This blog post originated from here (https://chanmingman.wordpress.com).
The content is extracted from a Microsoft email.
(Read it now, I am not sure how long Microsoft will leave the link alive).

Read: https://bit.ly/4ag8pqd

Posted in .Net, Cloud, Community, Computers and Internet, CTO, Data Platform | Tagged | Leave a comment

Six Strategies for AI Implementation

Six Strategies for AI Implementation: Responsible AI practices on Azure

Position your org for long-term success by responsibly implementing and scaling AI using Azure. Ensure compliance, mitigate risks, and build trust using the strategies for responsible and ethical AI in this eBook.

Get Six Strategies for AI Implementation: How to apply responsible AI practices on Azure to learn about:

  • The importance of having a responsible AI framework in place.
  • Adhering to AI best practices using Microsoft’s Responsible AI scorecard.
  • The benefits of scaling up AI adoption at your organisation.
  • Enhancing customer experience using AI.


This blog post originated from here (https://chanmingman.wordpress.com).
The content is extracted from a Microsoft email.
(Read it now, I am not sure how long Microsoft will leave the link alive).

Read: https://bit.ly/3WEO6jf

Posted in .Net, Cloud, Community, Computers and Internet, CTO, Data Platform | Tagged | Leave a comment

JavaScript Calls Web Api

This blog article shows you how to call a C# Web API using JavaScript. For JavaScript to call the Web API, you might need to add CORS code like the line below to Program.cs in your Web API.

app.UseCors(builder => builder.AllowAnyHeader().AllowAnyMethod().WithOrigins("https://localhost:7046"));
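For context, here is a minimal Program.cs sketch showing where that call typically sits — after `builder.Build()` and before the controllers are mapped. This assumes the default controller-based Web API template, so treat it as a guide rather than a drop-in file:

```csharp
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();

var app = builder.Build();

// Allow the JavaScript page's origin to call this API from the browser
app.UseCors(b => b.AllowAnyHeader()
                  .AllowAnyMethod()
                  .WithOrigins("https://localhost:7046"));

app.MapControllers();
app.Run();
```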

clip_image002

With the JavaScript below, you can call the Web API listening on port 7271 from a page served at https://localhost:7046.

<script>
  // Define the API URL
  const apiUrl = 'https://localhost:7271/WeatherForecast';

  // Make a GET request
  fetch(apiUrl)
    .then(response => {
      if (!response.ok) {
        throw new Error('Network response was not ok');
      }
      return response.json();
    })
    .then(data => {
      // Work with the JSON data here
      const weatherDataElement = document.getElementById('weatherData');
      weatherDataElement.textContent = JSON.stringify(data, null, 2);
      console.log(data);
    })
    .catch(error => {
      console.error('Error:', error);
    });
</script>
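The script writes its output into an element with the id `weatherData`, so the host page needs a matching element — for example, a placement like this somewhere in the page body (the `<pre>` tag is just one reasonable choice for pretty-printed JSON):

```html
<pre id="weatherData">Loading…</pre>
```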


You can download the code and try it out.

Source code download: https://github.com/chanmmn/WebApi/tree/master/2024/JavaScriptCallWebApiCS/WebAppEmpty?WT.mc_id=DP-MVP-36769

Reference: https://learn.microsoft.com/en-us/aspnet/core/tutorials/web-api-javascript?view=aspnetcore-8.0?WT.mc_id=DP-MVP-36769

Posted in .Net, Cloud, Community, Computers and Internet, programming | Tagged | Leave a comment