Posts Tagged: ‘#HCLSoftware’

Introducing HCLSoftware U

April 17, 2023 Posted by Tony Arnold


We are very excited to launch the first phase of the HCLSoftware U training cloud. HCLSoftware U is the new brand and training cloud for all HCLSoftware solutions, serving our customers, partners, and internal training. We currently offer hundreds of general training courses plus Domino, BigFix, and VoltMX certification opportunities, with more to come.

Training has changed radically in just the last ten years, including a move from multi-hour lecture formats to small micro-courses that can be completed during a coffee break. Even in-person instructor-led training has morphed from multi-day lectures into more practical, hands-on active-learning classrooms where teams collaborate to complete a working lab exercise.

General training, how-to, and even sales training have become mobile-enabled, learn-on-the-go formats. Though our proctored third-party certification exams are computer-based, all of our other non-certification courses are accessible via typical desktop and mobile devices and are optimized for Chrome browsers. We are working hard to listen to customer feedback and continually improve and streamline the overall HCLSoftware U learning experience, so do expect the training cloud to update as we move through this journey with you.

Sign in or sign up and check out the new HCLSoftware U training cloud.

The new HCLSoftware U Brand
HCLSoftware has historically supported our solutions through many different training options. To simplify and improve the user experience, we have consolidated the following HCL Software training brands and content into HCLSoftware U as a single training cloud platform:

  • iAscend
  • The Academy
  • HCL Software Academy

Current Training & Future Plans
HCLSoftware has taken a big step from individual instructor-led training for customers and partners into hybrid instructor-led and virtual online training. This strategy delivers many advantages to our customers, partners, and internal teams. Advantages include potentially lower training costs, simpler access to training, more highly skilled and trained solutions teams, access to a larger set of training assets, improved self-service, more ROI through deeper solution understanding, and allowing instructor-led training to focus on business use cases rather than basic-level training. We are launching the new HCLSoftware U with an impressive number of online virtual courses and will be offering many more, plus other training options for those who want a path to become HCLSoftware solution certified.

Creating a holistic and comprehensive Learning & Development training cloud for HCLSoftware solutions is a long-term journey, with the future goal of offering Level 1 and Level 2 Sales Accreditation and Technical Admin and Developer Certification programs for all HCLSoftware solutions. Since October 2022, in just six months, we have delivered more than 100 external courses and hundreds of internal/partner training courses, plus four certification programs, and we already have thousands of internal and external users on HCLSoftware U… and we are adding more each week.

Accessing HCLSoftware U
Customers – Customers can easily register and access HCLSoftware U via the standard signup process and will have access to all customer-facing training and certification programs.

Partners – All partners have access to HCLSoftware U via the standard signup process; however, approval for access to gated, partner-only seller content may take up to 48 hours in some cases.

Sign in or sign up and check out the new HCLSoftware U training cloud.

Our Partner Philosophy Is a Key Part of Our Roadmap to the Future

February 23, 2023 Posted by Rajiv Shesh


When you put our gigantic global footprint of business-critical products and services alongside our fierce commitment to product innovation, it’s tempting to say there’s no company on Earth like HCLSoftware. Without a doubt, the essential part we play in the world economy — and our 360-degree awareness of how our technologies are engineered and used worldwide — give us a unique perspective to match our unique structure and role.

But in my view, there’s another factor whose impact on our business may be just as great — our partner philosophy, which shapes our strategy, helps to define our day-to-day operations, and provides a key element of our roadmap to the future.

We recently gave our partners an updated view of our approach to partnership at the HCLSoftware Business Partner Executive Summit, held in Barcelona last November. Without trying to recap everything shared at that remarkable event (which would require a documentary, not a blog post), I’d like to zoom in on a few aspects of that approach.

A key takeaway from that summit was HCLSoftware’s emphasis on achieving scale and growth together with our partners — using “together” in the strongest possible sense. The new partner program we presented in Barcelona, the engagement platform we’ve rolled out, and our 50/50 co-marketing approach all reflect this — as does the partner philosophy that underlies them.


There are two key aspects of our partner philosophy that define our approach.

The first sounds straightforward, but in fact goes much deeper than it might first appear: We empower our partners to create value for their customers. Briefly put, that means giving them the right positioning, the right training, the right pricing, and the knowledge and readiness required to interact with their customers effectively and at high velocity.

It also means structuring our teams to give them maximum incentive to help each partner succeed — including think-outside-the-box measures like evaluating our own sales teams based in part on the partner teams’ success. Our teams are fully invested in each partner’s unique concerns, customer experience, and sales outcomes, and they go the extra mile to deliver on that investment.

The second is something that’s sometimes called “subservient leadership” — and it’s essential to grasping our “Partners First” mantra. In a nutshell, it means leadership through listening, trust, and empathy — and using the deeper understanding that results to remove obstacles, provide resources, and open doors for our partners. Mind you, this doesn’t mean abdicating true leadership or relinquishing the power to guide or inspire. It does mean informing that guidance with a detailed knowledge of the partner’s concerns — and, equally important, those of their customers — and making those concerns our first priority.

We don’t do this just to be nice (although there’s nothing wrong with that), but because it works. For starters, it puts us in a better position to execute on strategy. What’s more, this emphasis on transparency and value creation multiplies the value created, often exponentially. In a command-and-control approach, directives are issued and executed from the top down, with value created at that single point of execution. But with team members empowered to engage responsively with partners and their customers, those points of value-creation are multiplied, and marketplace successes are magnified — for everyone.

HCLSoftware’s core values of trust, transparency, and value creation are the essential reason we’re able to put partners first in this way. Our leadership is committed to providing the tools and knowledge clients need to succeed with their customers — both to our own teams and to our partners. That lets us engage our partners in new and innovative ways that will multiply their success — and our own.

New Domino REST APIs Are Now Available

December 8, 2022 Posted by Adam Gartenberg

We’re pleased to announce that the Domino REST APIs are now generally available for download by Domino Complete Collaboration (CCB) and Domino CEO Communications customers on active support in their Domino v12.0.2 “Danube” environment.

Known during development and beta testing as Project Keep, these contemporary REST APIs increase the number of REST APIs available to Domino deployments from nine to over 100, allowing you to extend your Domino applications with the development tools of your choice. You can broaden the audience for your Domino development without sacrificing Domino’s security or ease of deployment. And with this new REST API access to Domino servers and databases, you can securely make Domino servers available for pro-code development. Developers can expose Domino data in the form of standardized OpenAPI-based methods while maintaining control of which data will be made available for view or update.

How can it be used?

To give just a few examples of how the REST APIs can be used, you can apply them to:

  • Web applications: Display job openings stored in a Domino database on the company website, validating allowable postings without requiring Java or JavaScript skills.
  • Mobile Applications: Use Volt MX to create a front-end mobile app for the travel approval system run on Domino. Employees can enter their travel requests, and when managers authenticate, they will see and can complete the workflow approval.
  • Application Integration: Integrate contracts and other customer data stored in Domino with Salesforce CRM, giving sales representatives a single view of all necessary information.

What is it? 

The Domino REST APIs allow for pro-code development through secure access to HCL Domino servers and databases, allowing you to use the programming language and framework of your choice — while maintaining the reliability and deployability of your Domino data and applications. They run alongside the server and allow you to expose — securely and easily — your Domino data in the form of standardized OpenAPI-based methods. Using a browser-based admin UI, application owners can define which data will be made available for view or update via a REST API. The APIs extend the Domino principles of reader/author document access definitions into the world of Internet protocols.

The Domino REST APIs also include the Swagger UI, an API explorer that lets you visualize and interact with the APIs without having any of the actual implementation logic in place. The visual documentation is generated automatically from an OpenAPI (formerly Swagger) specification, making it easier to implement the back-end code later.
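Because the APIs are described by a machine-readable OpenAPI specification, you can also enumerate the available operations programmatically before writing any client code. Here is a minimal Python sketch; the tiny inline spec and its paths are fabricated stand-ins for the much larger document your Domino server would actually serve (browsable via the Swagger UI):

```python
# Enumerate the operations declared in an OpenAPI specification.
# The spec below is a made-up miniature example; a real Domino server
# serves its own, much larger OpenAPI document.
spec = {
    "openapi": "3.0.0",
    "paths": {
        "/api/v1/document/{unid}": {
            "get": {"summary": "Read a document"},
            "patch": {"summary": "Update a document"},
        },
        "/api/v1/lists/{name}": {
            "get": {"summary": "Read view entries"},
        },
    },
}

def list_operations(spec):
    """Flatten an OpenAPI spec into (METHOD, path, summary) tuples."""
    return [(method.upper(), path, op.get("summary", ""))
            for path, ops in spec["paths"].items()
            for method, op in ops.items()]

for method, path, summary in list_operations(spec):
    print(f"{method:5} {path}  -  {summary}")
```

The same flattening works against the real specification once you download it from your server, which is a handy way to audit exactly which operations a client could call.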

What’s special about the new Domino REST APIs?

  • Secure by default, with fine-grained controls on a per-form, per-field, and per-user basis
  • Implements the latest open standards
  • HTTP/2-ready, for server-to-server or client-to-server communication
  • API-first design with full interactive documentation
  • Low barrier to entry, as it runs on a Domino server and/or your Notes client
  • Admin UI and Postman samples included
  • State-of-the-art JWT access token integrated with your existing IdP infrastructure

What can be accessed via the Domino REST APIs?

You can access content like views, documents, and fields, as well as agents and ACL settings. And, of course, you can use DQL queries to quickly access the data you are looking for.

Built-in declarative security ensures the API will only allow access to fields the caller is authorized to see or update. This can effectively prevent computed fields from being overwritten and limits participants in a workflow to updating their fields only.
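In practice, a client calls these APIs over plain HTTPS with a bearer token. The sketch below builds such a request using only the Python standard library; the host, the endpoint path, and the JSON body shape are illustrative assumptions, so check the interactive documentation on your own server for the exact contract:

```python
# Build (but do not send) a DQL query request against a Domino REST API.
# The host, the "/api/v1/query" path, and the JSON body shape are
# assumptions for illustration; consult your server's OpenAPI docs.
import json
import urllib.request

def build_query_request(base_url, token, dql, data_source):
    """Construct a POST request that would run a DQL query."""
    body = json.dumps({"query": dql, "dataSource": data_source}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/api/v1/query",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",  # JWT from your existing IdP
            "Content-Type": "application/json",
        },
    )

req = build_query_request(
    "https://domino.example.com", "eyJhbGci...", "form = 'JobOpening'", "demo")
print(req.get_method(), req.full_url)
# Actually sending it would be: urllib.request.urlopen(req)
```

Because the security model is declarative and server-side, the same request simply returns fewer fields for a caller with narrower access; the client code does not change.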

Where to find it

The Domino REST APIs are now available as a prepackaged, ready-to-use Docker container for Domino Complete Collaboration (CCB) and Domino CEO Communications customers on active support. The download is now available in the “HCL Domino” product line category on our License and Download Portal. Please visit the documentation here to learn more and to set up your test environment. Note: The REST APIs only work with Domino v12.0.2 “Danube” and higher. (Hint: If you haven’t upgraded yet, now’s the time!)


How to Choose the Right Database for Commerce

October 19, 2021 Posted by Amanda Tevis

“What is the best database for large eCommerce websites, and why?” was a question I saw on Quora. This question is important because managing data within your eCommerce website plays a pivotal role in delivering positive user experiences and smooth transactions. No matter how well your eCommerce website interface is designed or how nice the products and services look, it will be worthless unless the website has the performance to quickly retrieve, process, and deliver information.

Your database handles organizing and managing your company’s data. And with over 340 databases on the market, finding a powerful, reliable, and secure database for eCommerce can be a challenge, and choosing the best one can be overwhelming.   

But we are here to help!   

Below, we give some helpful tips on what to look for when choosing the right database for your eCommerce website. 

Cloud Native: When evaluating databases, look for an enterprise-grade cloud native database that makes deploying, scaling, and managing data easy within a Kubernetes environment because a cloud native database offers transactional processing speed, reliability, and flexibility all in one. 

Reliability: Consider a database with high availability (HA) that provides the performance and resiliency enterprises need, while meeting the Recovery Time Objective (RTO) and Recovery Point Objective (RPO) for your business. High availability ensures the uptime that mission-critical enterprise applications require for business success.

Easy Deployment: Developers and agile teams want to begin development in a matter of minutes as opposed to days or weeks. Finding a database that allows for quick deployment yields significant benefits, including reduced time to value, decreased operational costs, and shortened output hours for developers. 

Versatility: Look for a database that runs on any cloud (private, public, or hybrid) with no vendor lock-in, because that capability translates well to on-premises, edge, and cloud deployments, and even embedded/IoT solutions. Also, a multi-model database that encompasses multiple disparate data models within a single, streamlined backend allows you and your company to easily combine different database model types into one integrated database engine.

Transactional speed: Speed and performance are top considerations when choosing a database to power critical applications. A database that rapidly delivers the throughput (TPS/TPM), response time, and cost per transaction modern applications require is necessary for large eCommerce platforms. 

Ease of Use: A modern user interface for monitoring and managing the database delivers business productivity and efficiency. Therefore, you must pay attention to a database’s usability from various viewpoints, including the number of active users it can support at a time, visualization, and overall ease of use. In addition, you should consider ease of reporting key performance indicators (KPIs), a task scheduler, a customizable alerting system, and operating system and framework compatibility.

Cost: While scalability and transactional speed requirements are crucial considerations, you can’t overlook the total cost of ownership (TCO) of implementing a database with your eCommerce solution. Your new database should fall within your budget and align with your organization’s financial objectives and existing sales and marketing plans.

Trial the Product: Vendors provide experts who will demo the database and perhaps build a customized demo based on your requirements. But you need to know what’s going on behind the curtain. So instead of being shown a static report, why not build the demo yourself? Short of taking a class to learn about the database, the best thing you can do is request a free trial so that you can work hands-on with the database. It’s a fantastic way to get a real perspective on the product.

Same Vendor: Now, picking an eCommerce solution from one vendor and a database from another one happens all the time and can work. But one way you can look to complement your eCommerce solution with a powerful database is to buy from the same vendor. This ensures tight integration between the products, and can deliver on innovation that customers want without being constrained by other software vendors. Plus, the same vendor carefully plans roadmaps, support, and new releases for their products to work collectively without issues. 

The list of factors to consider when choosing a database is endless. However, the above tips are some of the most valuable ones if you are looking for a place to start.  

Are you looking for a powerful, reliable, proven database to power your cloud-native application development projects? Or have questions? Contact the OneDB team, and we will assist. 

Also, check out HCL Commerce, an enterprise commerce platform that delivers powerful B2C and B2B customer experiences.

Upload/Download your Azure Storage files by using Azure Data Lake Storage Plugin with Workload Automation

May 19, 2021 Posted by Shubham Chaurasia

Let us begin with an understanding of what Azure is all about before moving on to our Azure Storage plugin and how it benefits our Workload Automation users.

“Azure is an open and flexible cloud platform that enables you to quickly build, deploy and manage applications across a global network of Microsoft-managed datacentres. You can build applications using any language, tool, or framework. And you can integrate your public cloud applications with your existing IT environment.”

Azure is incredibly flexible, and allows you to use multiple languages, frameworks, and tools to create the customised applications that you need. As a platform, it also allows you to scale applications up with unlimited servers and storage.

What is an Azure Storage Account?

The Azure Storage platform is Microsoft’s cloud storage solution for modern data storage scenarios. Core storage services offer a massively scalable object store for data objects, disk storage for Azure virtual machines (VMs), a file system service for the cloud, a messaging store for reliable messaging, and a NoSQL store.

An Azure storage account contains all your Azure Storage data objects: blobs, files, queues, tables, and disks. The storage account provides a unique namespace for your Azure Storage data that is accessible from anywhere in the world over HTTP or HTTPS. Data in your Azure storage account is durable and highly available, secure, and massively scalable.
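That unique namespace means each core service gets a predictable public endpoint derived from the account name. A quick sketch (the account name below is a placeholder):

```python
# Default public endpoints for each core Azure Storage service.
# "mystorageacct" is a placeholder account name.
SERVICES = ("blob", "dfs", "file", "queue", "table")

def storage_endpoints(account_name):
    """Return the default public endpoint URL for each core storage service."""
    return {svc: f"https://{account_name}.{svc}.core.windows.net"
            for svc in SERVICES}

endpoints = storage_endpoints("mystorageacct")
print(endpoints["dfs"])  # → https://mystorageacct.dfs.core.windows.net
```

The `dfs` endpoint is the one Data Lake Storage Gen2 (and hence this plugin) talks to, while classic Blob storage traffic goes to the `blob` endpoint of the same account.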

Core storage services

The Azure Storage platform includes the following data services:

  • Azure Blobs: A massively scalable object store for text and binary data. Also includes support for big data analytics through Data Lake Storage Gen2.
  • Azure Files: Managed file shares for cloud or on-premises deployments.
  • Azure Queues: A messaging store for reliable messaging between application components.
  • Azure Tables: A NoSQL store for schemaless storage of structured data.
  • Azure Disks: Block-level storage volumes for Azure VMs.

Introduction to Azure Data Lake Storage Gen2

Azure Data Lake Storage Gen2 is a set of capabilities dedicated to big data analytics, built on Azure Blob storage. Data Lake Storage Gen2 converges the capabilities of Azure Data Lake Storage Gen1 with Azure Blob storage. For example, Data Lake Storage Gen2 provides file system semantics, file-level security, and scale. Since these capabilities are built on Blob storage, it provides low-cost, tiered storage, with high availability/disaster recovery capabilities.

Figure 1 Azure Data Lake gen2

Let us clearly understand the benefits with the following example:

Cloud computing has enabled many teams to adopt agile development methods. They need to repeatedly deploy their solutions to the cloud, and know their infrastructure is in a reliable state. As infrastructure has become part of the iterative process, the division between operations and development has disappeared. Teams need to manage infrastructure and application code through a unified process.

To meet these challenges, you can automate the upload and download of multiple files and use the practice of infrastructure as code.

Using Azure SPN (Service Principal Name) credentials or an access key, the user can log in and select an available container in the Azure storage account.

Instead of using the Azure portal, you can upload or download an existing file by using the Azure Storage plugin with Workload Automation. Using Azure SPN credentials or an access key, the user can log in and see all the available files on the server (Azure Storage – Data Lake Gen2).
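For a sense of what the plugin does on your behalf, here is a rough standard-library sketch of the REST calls an ADLS Gen2 upload involves: acquiring an OAuth2 client-credentials token for the SPN, then issuing create, append, and flush calls against the file path. All names (tenant, client, account, container, file) are placeholders, real calls need a few more headers, and the plugin handles every one of these steps for you:

```python
# Sketch of the two stages behind an ADLS Gen2 upload: an OAuth2
# client-credentials token for the SPN, then the Data Lake Storage Gen2
# REST calls (create, append, flush). Requests are built, not sent.
import urllib.parse
import urllib.request

def token_request(tenant_id, client_id, client_secret):
    """Build the OAuth2 client-credentials request for an SPN token."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://storage.azure.com/.default",
    }).encode()
    return urllib.request.Request(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        data=body, method="POST")

def upload_requests(account, container, path, data, token):
    """Build the three ADLS Gen2 REST calls that upload one file."""
    base = f"https://{account}.dfs.core.windows.net/{container}/{path}"
    auth = {"Authorization": f"Bearer {token}"}
    create = urllib.request.Request(
        f"{base}?resource=file", method="PUT", headers=auth)
    append = urllib.request.Request(
        f"{base}?action=append&position=0", data=data,
        method="PATCH", headers=auth)
    flush = urllib.request.Request(
        f"{base}?action=flush&position={len(data)}",
        method="PATCH", headers=auth)
    return [create, append, flush]

reqs = upload_requests("mystorageacct", "mycontainer", "report.csv",
                       b"col1,col2\n1,2\n", "eyJhbGci...")
print([r.get_method() for r in reqs])  # → ['PUT', 'PATCH', 'PATCH']
```

A download is the reverse: a single GET against the same `dfs` path. In the plugin, you never see any of this; you just fill in the connection and action fields described below.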

Let us walk through the plugin, starting with the job definition parameters.

Azure Storage Plugin

Log in to the Dynamic Workload Console and open the Workload Designer. Choose to create a new job and select “Azure Data Lake Storage Plugin” job type in the Cloud section.

Figure 2 Job Definition

Connection Tab

Establishing connection to the Azure server:

Connection Info

Use this section to connect to the Azure server.

Subscription – The ID that uniquely identifies your subscription to Azure. This attribute is required. If not specified in the job definition, it must be supplied in the plug-in properties file.

Client – The Azure Client ID associated with your SPN account. This attribute is required. If not specified in the job definition, it must be supplied in the plug-in properties file.

Tenant – The Azure Tenant ID associated with your SPN account. This attribute is required. If not specified in the job definition, it must be supplied in the plug-in properties file.

Password (Key) – The Azure Client Secret Key associated with your SPN account, also known as the client key. This attribute is required. If not specified in the job definition, it must be supplied in the plug-in properties file.

Account Name – The account name associated with your Azure Data Lake Storage account.

Test Connection – Click to verify that the connection to the Azure server works correctly.

Figure 3 connection tab – SPN

OR

Access Key Authentication

Account Name – The account name associated with your Azure Data Lake Storage account.

Access Key – Use this option to authorize access to data in your storage account.

Figure 4 Connection Tab – Access key

Action Tab

Use this section to define the operation details.

Operation

Container Name – Specify the name of the container in which the files are stored. Click the Select button to choose a container defined in the cloud console. Select an item from the list; the selected item is displayed in the Container Name field.

Figure 5 Action Tab – Select Container

Select Operations

Use this section to either upload or download objects.

Figure 6 Action tab – upload

Upload File – Click this radio button to upload files to the Storage Account.

Folder Location Inside Container – Enter the name of the file to be uploaded or the path where the file is stored. Click the Search button to choose a file defined in the cloud console. Select items from the list (you can select multiple files); the selected items are displayed in the Folder Location Inside Container field.

Source File Paths – Displays the path of the source file. You can use the filter option to streamline your search.

If a file already exists – Select the action the application should perform if the uploaded file already exists in the console.

  • Replace – Replaces the existing file in the console.

  • Skip – Skips the upload of the selected file.

Download File – Click this radio button to download files from the Storage Account.

Figure 7- Action tab – Download

Select Files – Click the Select Files button to choose the files defined in the cloud.

Destination File Path – Provide the location to which files are downloaded or from which they are uploaded. Click the Select button to choose the location of the source file; the selected item is displayed in the Destination File Path field.

Submitting your job

It is time to submit your job into the current plan. You can add your job to the job stream that automates your business process flow. Select the action menu in the top-left corner of the job definition panel and click Submit Job into Current Plan. A confirmation message is displayed, and you can switch to the Monitoring view to see what is going on.

Figure 8 Submit Job

Figure 9 Monitor Job

Figure 10 Monitor Job

Figure 11 Job Log

Figure 12 Workflow Details

Are you curious to try out the Azure Data Lake Storage plugin? Download the integration from Automation Hub to get started, or drop us a line at santhoshkumar.kumar@hcl.com.

 

Author Bios

Shubham Chaurasia – Developer at HCL Software

Responsible for developing integration plug-ins for Workload Automation. Hands-on with different programming languages and frameworks like Java, JPA, microservices, MySQL, Oracle RDBMS, and AngularJS.

LinkedIn – https://www.linkedin.com/in/shubham-chaurasia-1a78b8a9/

 

Rabic Meeran K, Technical Specialist at HCL Technologies

Responsible for developing integration plug-ins for Workload Automation. Hands-on with different programming languages and frameworks like Java, JPA, Spring Boot, microservices, MySQL, Oracle RDBMS, Ruby on Rails, Jenkins, Docker, AWS, C, and C++.

LinkedIn – https://www.linkedin.com/in/rabic-meeran-4a828324/

 

Saket Saurav, Tester (Senior Engineer) at HCL Technologies

Responsible for performing automation and manual testing for different plugins in Workload Automation using the Java Unified Test Automation Framework. Hands-on experience with the Java programming language and web services, with databases like Oracle and SQL Server.

LinkedIn – https://www.linkedin.com/in/saket-saurav-8892b546/