“Adhan Package: Mapping Prayer Times in Helsinki, Finland”

[This article was first published on gacatag, and kindly contributed to R-bloggers]. (You can report issues about the content on this page here.)



The adhan package is available on GitHub (gacatag/adhan)!

Prayer times cannot always be estimated accurately in some places, such as countries located at higher latitudes (e.g. the Nordic countries): during midsummer, for instance, the Fajr time may be impossible to estimate, or in other words it may simply not exist! Some Muslim residents of those countries follow the prayer times of other places, such as Mecca and Medina. However, daylight saving time can make this complicated! Furthermore, some align the Dhuhr prayer time of Mecca with the local Dhuhr time and measure all the remaining times based on their differences from Dhuhr in Mecca. This also resolves complications caused by daylight saving time. The adhan package facilitates mapping the prayer times of two locations based on alignment over a specific time (e.g. Dhuhr). It can also show the prayer times of a city using several methods. The package depends on the Aladhan API.
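The Dhuhr-alignment idea can be sketched with base R date arithmetic. The Mecca times below are illustrative placeholders, not actual API output:

```r
# Illustrative Mecca prayer times (hypothetical values, not real API output)
mecca <- as.POSIXct(
  c(Fajr = "2024-04-01 04:57", Sunrise = "2024-04-01 06:12",
    Dhuhr = "2024-04-01 12:22", Asr = "2024-04-01 15:49",
    Maghrib = "2024-04-01 18:41", Isha = "2024-04-01 20:11"),
  tz = "Asia/Riyadh")

# Local (Helsinki) Dhuhr, e.g. as reported by adhan::adhan()
local_dhuhr <- as.POSIXct("2024-04-01 13:24", tz = "Europe/Helsinki")

# Shift every Mecca time by the (local Dhuhr - Mecca Dhuhr) offset
offset <- difftime(local_dhuhr, mecca["Dhuhr"], units = "secs")
mapped <- mecca + as.numeric(offset)

# Display in local time; the spacing between prayers is preserved
format(mapped, "%H:%M", tz = "Europe/Helsinki")
```

Because the whole schedule is shifted by a single offset anchored at the local Dhuhr, a daylight saving change in either city moves every mapped time together rather than breaking individual prayers.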

The library is available on the gacatag GitHub account. It can be installed using the install_github() function from the devtools package.

devtools::install_github("gacatag/adhan")

The following script shows the local prayer times of Helsinki, Finland for today (1st of April 2024), measured by the “Institute of Geophysics, University of Tehran” method. A specific day, month, and year can also be specified. For more info, check the parameter settings by typing ?adhan::adhan.

adhan::adhan(city="Helsinki", country="Finland", method=7)
#          date           Fajr        Sunrise          Dhuhr            Asr         Sunset        Maghrib
#  "01-04-2024" "04:07 (EEST)" "06:44 (EEST)" "13:24 (EEST)" "16:50 (EEST)" "20:05 (EEST)" "20:36 (EEST)"
#          Isha          Imsak       Midnight     Firstthird      Lastthird
#"22:02 (EEST)" "03:57 (EEST)" "01:24 (EEST)" "23:38 (EEST)" "03:11 (EEST)"

Currently, 17 methods are supported by the Aladhan API. Defining a custom method is also possible. The following shows the Helsinki prayer times for the entire month of April 2024.

HelsinkiAdhanApr2024 <- adhan::adhanMonth(
    method="7",
    city="Helsinki",
    country="Finland",
    month=4,
    year=2024)

The following maps the prayer times of Mecca to Helsinki by aligning the Dhuhr of the two cities.

HelsinkiMeccaAdhanApr2024 <- adhan::adhanMapMonth(
    method="7",
    city="Helsinki",
    country="Finland",
    mapCity="Mecca",
    mapCountry="Saudi Arabia",
    mapBy="Dhuhr",
    month=4,
    year=2024)

The tables can be organized into a nicer format using functions from the kableExtra package.

library(kableExtra)

x <- kbl(HelsinkiMeccaAdhanApr2024,
    table.attr = "style='width:100%;'") %>%
    kable_classic(full_width = TRUE, position = "center")

save_kable(x, file = "April.pdf")

as_image(x, file = "April.png")

To leave a comment for the author, please follow the link and comment on their blog: gacatag.

R-bloggers.com offers daily e-mail updates about R news and tutorials about learning R and many other topics. Click here if you’re looking to post or find an R/data-science job.



Continue reading: adhan package: retrieving and aligning the prayer times in R

Key Insights from the Adhan Package

The adhan package is exciting news for the Muslim community, especially those residing at higher latitudes. Built on the Aladhan API, the package can map prayer times between two locations and show the prayer times of a city through multiple methods. It aims to resolve the complexities arising from daylight saving time while providing a platform for accuracy and widespread adoption.

Long term implications and future developments

The adhan package’s future seems to hold immense promise as it continues to assist Muslims worldwide, especially those residing in higher latitudes, with more accurate prayer times. However, the future will still see significant advancements and feature additions for additional convenience and accuracy.

Potential advancements in convenience

In the future, users could expect to see enhanced functionality in setting tailored alerts or reminders based on accurate prayer timings, thereby avoiding the worry of missing a prayer. Additionally, integrating the package with commonly used smart home devices may also be a prospective development, making it even more convenient for users.

Improved accuracy and location coverage

While the package already boasts considerable accuracy, future updates may aim to perfect this with ongoing tweaks and adjustments. Furthermore, expanding location coverage beyond existing capabilities might even eliminate the need to use estimates, providing even the remotest participants with precise prayer timings.

Actionable Advice

For enthusiasts looking to maximize the benefits from the adhan package, here are some recommendations:

  1. Constant updates: Regularly updating the package ensures users benefit from the latest additions and improvements, maintaining accuracy and consistency.
  2. Learn the nuances: Understanding the underlying principles and functionalities of the adhan package can help users tailor it more effectively to their needs.
  3. Reach out for support: If you encounter any issues, reaching out to the broader community can be helpful. The package’s GitHub repository is also a treasure trove of valuable information and solutions.

In conclusion, the adhan package’s importance and potential for future developments is immense, making it an essential tool for Muslims globally. By understanding its functions and keeping it updated, users can ensure they reap its maximum benefits.

Read the original article

“Jumpstart Your AI Journey with Google’s Courses”


Start your AI journey today with these courses from Google.

The Future of AI: Implications and Developments from Google AI Courses

The field of artificial intelligence (AI) has seen a rapid and revolutionary evolution in the past decade, with Google leading the way in both research and education. Their AI courses offer a comprehensive and accessible journey into this exciting field, enabling you to stay ahead in your professional or personal goals.

Long-term Implications of AI

Artificial Intelligence and machine learning are no longer just science fiction—they are deeply integrated into our everyday life and will continue to shape our future. Let’s explore the potential long-term implications of the rise in AI and machine learning:

  1. Productivity Enhancements: AI technologies can automate routine tasks, allowing individuals and companies to focus on more complex and creative tasks. This could dramatically increase global productivity and economic growth in the long run.
  2. Establishment of New Industries: As technology advances, it is likely that entirely new industries and job roles will be established, centred around AI and machine learning. Professionals who understand and can apply these technologies effectively will have a competitive advantage in these emerging sectors.
  3. Greater Personalization: With the increase in AI and machine learning algorithms, there will be the capacity for more personalization—which means everything from online shopping experiences to education could become more tailored and efficient.
  4. Ethical and Privacy Challenges: As AI technologies become more prevalent, they also raise significant ethical and privacy issues that society will need to address. This will include everything from developing robust AI ethics guidelines to potentially creating new policy and regulations.

Future Developments in AI

Based on the insights derived from Google’s AI courses, here are some probable future AI developments:

  • AI and Healthcare: AI has significant potential to transform healthcare services, from diagnostic tools to patient care and management systems. This could revolutionize the way we access and provide health care globally.
  • AI in Education: With the help of AI, we could see more personalized and adaptive learning experiences. AI could also help in administrative tasks, thereby allowing education providers to concentrate more on teaching.

Actionable Recommendations

Given these implications and potential future developments, here are some actionable recommendations to make the most of the AI revolution:

  1. Enroll in the AI courses offered by Google: By developing a deep understanding of AI and machine learning, you can keep your skills current and relevant. The Google AI courses are an excellent place to start.
  2. Prepare for New Opportunities: Stay aware of emerging AI technologies and track new job roles and industries that may develop. This could help you to position yourself for future career progression.
  3. Promote Ethical AI use: Become an advocate for ethical AI use in your professional and personal networks. This could include promoting awareness of AI ethics guidelines and campaigns for responsible AI regulation.

Artificial Intelligence and machine learning hold tremendous transformative potential. As we tap into this potential and navigate its challenges, education will be our key tool. Google’s AI courses provide a robust starting point for this journey.

Read the original article

A “mix-in” is a component or feature added to an existing system or product to enhance its functionality, performance, or complexity without altering its core structure, akin to adding toppings to a dessert to enrich its flavor and appeal. Recently, a customer mentioned their plans to implement Generative AI (GenAI) for predictive maintenance. Maybe I’m… Read more: How to Transform Your ML Models with Generative AI

Generative AI: A New Horizon in Predictive Maintenance

Generative AI (GenAI) has rapidly attracted the attention of innovators worldwide. Particularly in industries heavily reliant on machinery, such as manufacturing, GenAI can enhance the predictive maintenance of systems. However, like any other technological advancement, it raises its unique array of opportunities and challenges.

Long-Term Implications of Generative AI

When it comes to the long-term implications of implementing generative AI in predictive maintenance, there’s a lot to consider. Firstly, increased efficiency and reduced downtime are notable potentials as the AI software can predict maintenance schedules accurately.

Over time, as the AI learns from data and evolves, the precision of its predictions is set to improve. This development can result in potential cost-saving and efficiency benefits for businesses, particularly those in manufacturing, logistics, and other industries where equipment maintenance is critical.

However, the transition to a predictive maintenance model could also lead to job displacement. As AI takes more care of monitoring and predictions, certain traditional roles may become redundant. Businesses would need to tackle this socioeconomic issue while retraining and upskilling employees for different positions.

Potential Future Developments

As we look to the possible future developments of using GenAI for predictive maintenance, there are a few key trends to pinpoint. As the technology matures, we could experience a more comprehensive digital transformation, with GenAI interweaving through various systems and processes.

Furthermore, there may be advancements in how the models are trained, with richer data sets leading to even more accurate predictions than previously. An increasing ability to predict potential issues in advance can help prevent costly downtimes.

Actionable Advice: Transforming Your ML Models with Generative AI

Considering the possible opportunities and challenges of implementing GenAI, businesses need to chart their course wisely. Here are some recommendations:

  1. Start small: Rather than making a complete switch to GenAI for predictive maintenance, start by testing it on a smaller scale. This allows for controlled experimentation and learning about your system’s responses.
  2. Upskill the workforce: Simultaneously, invest in reskilling your team to prepare them for the new technology and address potential job displacement.
  3. Collaborate with AI experts: If you don’t already have them on your team, you may need grounded tech professionals who understand the nuances of AI.
  4. Stay updated: Like every other technology, GenAI also keeps evolving. Regular updating of your system and technology is crucial.

While the journey to incorporating GenAI might seem arduous initially, the rewards that wait at the end could be well worth the investment in time, money, and effort.

Read the original article

Scraping PDF Text and Summarizing with OpenAI in R


[This article was first published on business-science.io, and kindly contributed to R-bloggers]. (You can report issues about the content on this page here.)



Hey guys, welcome back to my R-tips newsletter. Businesses are sitting on a mountain of unstructured data. The biggest culprit is PDF documents. Today, I’m going to share how to scrape text from PDFs and use OpenAI’s Large Language Models (LLMs) to summarize it in R.

Table of Contents

Here’s what you’re learning today:

  • How to scrape PDF documents: I’ll explain how to scrape the text from your business’s PDF documents using pdftools.
  • How to summarize PDFs using the OpenAI LLMs in R. This will blow your mind.


Get the Code (In the R-Tip 078 Folder)


SPECIAL ANNOUNCEMENT: ChatGPT for Data Scientists Workshop on April 24th

Inside the workshop I’ll share how I built a Machine Learning Powered Production Shiny App with ChatGPT (extends this data analysis to an insane production app):

ChatGPT for Data Scientists

What: ChatGPT for Data Scientists

When: Wednesday April 24th, 2pm EST

How It Will Help You: Whether you are new to data science or are an expert, ChatGPT is changing the game. There’s a ton of hype. But how can ChatGPT actually help you become a better data scientist and help you stand out in your career? I’ll show you inside my free chatgpt for data scientists workshop.

Price: Does Free sound good?

How To Join: 👉 Register Here


R-Tips Weekly

This article is part of R-Tips Weekly, a weekly video tutorial that shows you step-by-step how to do common R coding tasks. Pretty cool, right?

Here are the links to get set up. 👇

Businesses are Sitting on Millions of Dollars of Unstructured Data (and they don’t know how to use it)

Fact: 90% of businesses are not using their unstructured data. It’s true. Many companies have no clue how to extract it. And once they extract it, they have no clue how to use it.

We’re going to solve both problems in this R-Tip.

The most common form is text located in PDF documents.

Businesses have 100,000s of PDF documents that contain valuable information.

PDF Data

OpenAI Document Summarization

One of the best use cases of LLMs is document summarization. But how do we get PDF data to OpenAI?

One easy way is in R!

R Tutorial: Scrape PDF Documents and Summarize with OpenAI

This is a simple 2 step process we’ll cover today:

  1. Extract PDF Text: We’ll use pdftools to extract text
  2. Summarize Text with OpenAI’s LLMs: We’ll use httr to connect to OpenAI’s API and summarize our PDF document

Business Objective:

I have set up a PDF document of Meta’s 2024 10K Financial Statement. We’ll use this document to analyze the risks that Meta reported in their filing (without even reading the document).

This is a massive speed up – and I can ask even more questions too beyond just the risks to really understand Meta’s business.

Good questions to ask for this financial case study:

  1. What are the top 3 risks to Meta’s business?
  2. Where does Meta gain most of its revenue?
  3. In which business line is Meta’s revenue growing the most?

PDF Data

Get the PDF and Code

You can get the PDF and Code by joining the R-Tips Newsletter here.

R-Tip 078 Folder

Get the PDF and Code (In the R-Tip 078 Folder)

Load the Libraries

Next, load the libraries. Here’s what we’re using today:

Load Libraries

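The library-loading code in the original post is shown as a screenshot; a plausible reconstruction based on the tools named in this tutorial (the exact set in the post may differ) is:

```r
# Libraries used in this tutorial (reconstructed from the article text)
library(pdftools)  # extract text from PDF documents
library(httr)      # send requests to the OpenAI API
library(jsonlite)  # encode/decode JSON payloads
```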

Step 1: Extract PDF Text

With our project set up and libraries loaded, next I’m extracting the PDF text. It’s very easy to do in 1 line of code with pdftools::pdf_text().

Extract PDF Text

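A minimal sketch of this step (the PDF file name below is a hypothetical placeholder for the Meta 10K document in the R-Tip folder):

```r
library(pdftools)

# One line extracts the text of every page into a character vector
text <- pdf_text("meta-10k-2024.pdf")

length(text)                   # number of pages extracted
cat(substr(text[1], 1, 300))   # preview the start of the first page
```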

This returns a list of text for 147 pages in Meta’s 10K Financial Statement. You can see the text on each page by cycling through text[1], text[2] and so on.

Step 2: Summarize the PDF Document with OpenAI LLMs

A common task: I want to know what risks Meta has identified in their 10K Financial Statement. This is required by the SEC. But, I don’t want to have to dig through the document.

The solution is to use OpenAI to summarize the document.

We will just summarize the first 30,000 characters in the document. There are more advanced ways to create a vector storage, but I’ll save that for a follow up post.

Run this code to set up OpenAI and our prompt:

Note that I have my OpenAI API key set up. I’m not going to dive into all of that. OpenAI has great documentation to set it up.

OpenAI Prompt Set Up

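A hedged sketch of the setup step described above; the variable names and the prompt wording are assumptions, not the post’s exact code:

```r
# Read the API key from an environment variable (never hard-code it)
api_key <- Sys.getenv("OPENAI_API_KEY")

# Collapse the pages into one string and keep the first 30,000 characters
full_text    <- paste(text, collapse = "\n")
text_trimmed <- substr(full_text, 1, 30000)

# The question plus the document text form the prompt
question <- "What are the top 3 risks to Meta's business according to this 10K filing?"
prompt   <- paste(question, text_trimmed, sep = "\n\n")
```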

Run this code to send the text and get OpenAI’s response

I’m using httr to send a POST request to OpenAI’s API. Then OpenAI provides a response with the answer to my question in the context of the text I provided it.

Connect to OpenAI API

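A hedged sketch of the POST request using httr; the model name is an assumption and may not match the one used in the post:

```r
library(httr)

# Send the prompt to OpenAI's Chat Completions endpoint
response <- POST(
  url  = "https://api.openai.com/v1/chat/completions",
  add_headers(Authorization = paste("Bearer", api_key)),
  content_type_json(),
  encode = "json",
  body = list(
    model    = "gpt-3.5-turbo",
    messages = list(list(role = "user", content = prompt))
  )
)

# Fail loudly on HTTP errors (bad key, rate limit, etc.)
stop_for_status(response)
```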

Run this Code to Parse the OpenAI Response

In just a couple seconds, I have a response from OpenAI’s API. Run this code to parse the response.

Parse OpenAI API Response

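A hedged sketch of the parsing step, assuming the standard Chat Completions response shape (the answer text sits in the first choice’s message):

```r
# Parse the JSON body into nested R lists
parsed <- httr::content(response, as = "parsed")

# Pull out the assistant's answer and print it
answer <- parsed$choices[[1]]$message$content
cat(answer)
```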

Review the Response

Last, we can review the response from OpenAI’s Chat API. We can see that the top 3 risks are:

  1. Regulatory Compliance
  2. User Privacy and Trust Issues
  3. Competition and Innovation Risks

OpenAI Chat API Response

Conclusions:

You’ve learned my secret 2-step process for scraping PDF documents and using LLMs like OpenAI’s Chat API to summarize text data in R. But there’s a lot more to becoming an elite data scientist.

If you are struggling to become a Data Scientist for Business, then please read on…

Struggling to become a data scientist?

You know the feeling. Being unhappy with your current job.

Promotions aren’t happening. You’re stuck. Feeling Hopeless. Confused…

And you’re praying that the next job interview will go better than the last 12…

… But you know it won’t. Not unless you take control of your career.

The good news is…

I Can Help You Speed It Up.

I’ve helped 6,107+ students learn data science for business from an elite business consultant’s perspective.

I’ve worked with Fortune 500 companies like S&P Global, Apple, MRM McCann, and more.

And I built a training program that gets my students life-changing data science careers (don’t believe me? see my testimonials here):

6-Figure Data Science Job at CVS Health ($125K)

Senior VP Of Analytics At JP Morgan ($200K)

50%+ Raises & Promotions ($150K)

Lead Data Scientist at Northwestern Mutual ($175K)

2X-ed Salary (From $60K to $120K)

2 Competing ML Job Offers ($150K)

Promotion to Lead Data Scientist ($175K)

Data Scientist Job at Verizon ($125K+)

Data Scientist Job at CitiBank ($100K + Bonus)

Whenever you are ready, here’s the system they are taking:

Here’s the system that has gotten aspiring data scientists, career transitioners, and life long learners data science jobs and promotions…

What They're Doing - 5 Course R-Track


Join My 5-Course R-Track Program Now!
(And Become The Data Scientist You Were Meant To Be…)

P.S. – Samantha landed her NEW Data Science R Developer job at CVS Health (Fortune 500). This could be you.

Success Samantha Got The Job

To leave a comment for the author, please follow the link and comment on their blog: business-science.io.




Continue reading: How to Scrape PDF Text and Summarize It with OpenAI LLMs (in R)

Impact of Unstructured Data Extraction and Summarization Techniques

Businesses today are sitting on a gold mine of unstructured data, primarily in the form of PDF documents. However, a large majority struggle in extracting and making meaningful use of this data. Techniques such as OpenAI’s Large Language Models (LLMs) for summarizing PDF data in R have opened new avenues to counter this challenge. Going forward, the value of this wealth of unstructured data can be unleashed with better applications of these techniques.

Future Developments

The current trend points towards a future where businesses will rely more on automated data extraction and summarization tools. Potentially, these techniques can revolutionize how businesses handle large volumes of unstructured information. It can lead to faster decision-making processes and improved understanding of critical business aspects such as risk management.

Automated Risk Analysis

For instance, businesses can implement LLMs to conduct automated financial risk analysis. By analyzing the risks identified by companies in their 10K Financial Statements, these models can provide summaries of top risks, revenue sources, and fastest-growing business lines, thereby enhancing strategic decision-making. As more businesses incorporate this technology, newer applications will surface creating a ripple effect in the industry.

Actionable Advice

Considering these long-term implications and future developments, it is advisable for businesses to invest in technologies and skills relating to data extraction and summarization using techniques like pdftools and OpenAI’s LLMs. This will not only reveal the hidden value in their unstructured data but also enhance their competitiveness in the market.

For Businesses

  1. Invest in Training: Organizations should consider training their teams in data extraction and summarization techniques. This will help to unlock the potential in their unstructured PDF data.
  2. Adopt Automation: With advancements in data extraction and summarization tools, it is important to integrate these into the workflow for efficient data management.

For Individuals

  1. Learn R: As the tutorial suggests, learning R, and in particular the application of OpenAI’s LLMs and pdftools in R, can be a valuable asset for anybody dealing with unstructured data.
  2. Adopt a Data Scientist Mindset: It is crucial to approach these tools from the perspective of a data scientist. By asking the right questions, you can make the most out of the unstructured data at your disposal.

Read the original article

“Master Data Science with Free Courses: The Nine Steps to Job Readiness”


Learn everything about data science by exploring our curated collection of free courses from top universities, covering essential topics from math and programming to machine learning, and mastering the nine steps to become a job-ready data scientist.

Understanding the Future Implications and Developments in Data Science

The field of data science is continually expanding and evolving, providing endless opportunities for learning and growth. This article will explore the potential future developments in data science and what these could mean for individuals aspiring to become data scientists.

Long-term Implications

Advancements in data science capabilities mean the role and value of data scientists will continue to grow in the foreseeable future. With the proliferation of digital data, organizations increasingly require experts who can glean insights from this vast wealth of information.

As more businesses look to capitalize on their data, those with a mastery of data science will be in high demand. From identifying trends and patterns to predictive modeling, data scientists are integral in guiding business decision-making.

Potential Future Developments

Looking ahead, the field of data science promises several exciting developments. For one, machine learning is expected to play a larger role in data analysis. As algorithms become increasingly sophisticated, they will enable more accurate predictions and insights.

Secondly, the availability of extensive free online resources, including courses from top universities, is democratizing access to data science knowledge. This will continue to shape the sector, opening doors to a more diverse range of people and perspectives.

Actionable Advice

Master the Essentials

The first thing to remember is that getting a solid foundation in the basics is crucial. This includes math, programming and understanding the nine steps of becoming a job-ready data scientist. Make sure to take advantage of the free courses from top universities.

  • Math Skills: Focus on statistics and probability, calculus and linear algebra. These are the building blocks of data science.
  • Programming Skills: Learning to code is a fundamental skill in this field. Python and R are the most commonly used languages in data science.
  • The Nine Steps: From business understanding, data understanding, data preparation, modeling, evaluation, to deployment, mastering these nine steps is paramount to becoming job-ready in the data science field.

Stay Ahead of Trends

Keeping up with emerging trends is imperative in a fast-moving field like data science. Continuously refreshing your skills and knowledge will help keep your work relevant and in-demand.

Conclusion

The age of big data brings a new era of career opportunities. Becoming proficient in data science not only opens a world of job prospects, but it also allows for a unique capacity to interpret and understand the world around us. By mastering the basics and keeping an eye on future developments, you can develop a valuable skill set that will be in demand for many years to come.

Read the original article

The new breed of Large Language Models: why and how it will replace OpenAI and the likes, and why the current startup funding model is flawed

The Future of Large Language Models and Technology Startups

With the advent of new and improved large language models, the AI industry is set to experience a major shake-up. Predicted to render the offerings of incumbents like OpenAI obsolete, these models are expected to bring radical transformations across various sectors. Coupled with this changing landscape is the current startup funding model, which many argue is flawed. It is crucial to explore these changes and what they might mean for the industry long-term.

The New Breed of Large Language Models

Experts argue that the new breed of large language models will have the capability to replace the likes of OpenAI and provide more sophisticated AI solutions. The advanced systems are designed to offer higher efficiency, precision, and adaptability with improved cognitive capabilities. The speculation concerning the replacement stems from the potential limitations of OpenAI and its kind in meeting future technology demands.

Implications for AI Industry

This new change could have significant implications for the AI industry. With superior techniques, the new models could potentially shape the AI landscape by setting new standards for machine learning. While earlier models like OpenAI have paved the way in natural language processing, these newer models may elevate AI’s capabilities, encouraging more businesses to adopt AI solutions.

Future Developments

Given the potential of this new breed of large language models, future developments may evolve around enhancing their efficiency and adaptability to a wider range of sectors. The integration of these models in various business areas, from customer service to security, could redefine the way AI is used in our everyday lives. It is also likely that further research and development would focus on overcoming any limitations these new models may present.

The Startup Funding Model

Equally important to consider are the discussions surrounding the current startup funding model, which many perceive as flawed. Critics argue that the model pushes startups to show growth in terms of quantity over quality, leading to unsound business models and unrealistic expectations.

Long-term Implications

As more startups embrace the current funding model, there might be a surge in businesses that lack long-term sustainability, unable to deliver promised growth. This could result in significant economic consequences, including job losses and market instability.

Future Reforms

In response, one can expect future reforms to address these issues within the startup funding model. This could involve legislative changes pushing for more transparency, requiring startups to present a sound, realistic, and sustainable business model before securing funding. Alternatively, investors themselves might start prioritizing businesses that demonstrate sustainability over rapid but unstable growth.

Actionable Insights

  • Early Adoption: Businesses should consider exploring the potential of this new breed of large language models. Early adoption could provide a competitive edge.
  • Sound Business Planning: Startups must focus on creating sound business plans that prioritize quality growth and sustainability over rapid growth to attract discerning investors.
  • Vigilance: Investors should be vigilant about the startups they fund. Ensuring the business they invest in shows signs of long-term sustainability could safeguard their investments.