02. Clarity
AI News and Analysis for Australian Businesses
Top stories
OpenAI has released o1, a model that can reason and “think” by breaking a process down into its constituent parts, promising significant advances over its previous models. OpenAI.
Governments are seeking to balance innovation and safety, with OpenAI (ChatGPT) and Anthropic (Claude) giving the US AI Safety Institute early access to their models for safety and ethics testing. AI Safety Institute.
Continuing on the regulation front, the European Union AI Act has been passed, and a case is being made that Australia may follow suit. MinterEllison.
The Australian Government released a “Voluntary AI Safety Standard”, a “Policy for the Responsible Use of AI in Government” and a “Proposal for High-Risk AI Guardrails”, all foreshadowing legislation. Minister for Industry and Science.
Enterprise vendors like Salesforce and HubSpot have doubled down on their AI strategies, Apple has released Apple Intelligence coming with the iPhone 16, and Moveworks has surpassed $100M in revenue.
Victoria’s Department of Families, Fairness and Housing will block the use of AI tools after a child protection worker used ChatGPT to draft a report submitted to the Children’s Court. IT News.
Analysis
Over the past few weeks, foundation models like ChatGPT and Claude have advanced significantly, particularly with improved reasoning capabilities that allow models like OpenAI’s o1 to deconstruct problems into manageable components. These models form the backbone of many AI-driven consumer and enterprise applications, and their enhanced capabilities are rapidly being integrated across industries. Large enterprises, especially those serving both consumer and business markets, are intensifying their AI strategies, making AI tools more prevalent and accessible. As a result, businesses of all sizes stand to benefit from incorporating AI into their operations and from ensuring that employees are proficient in using these tools safely and effectively.
Governments, meanwhile, continue to grapple with how best to regulate AI to ensure its safe and responsible adoption. A recent example is the Victorian Government Department's decision to implement a blanket ban on AI, a move that reflects a failure to establish proper governance frameworks and training protocols. This decision is likely to have long-term negative consequences by hindering innovation and adoption, instead of mitigating risks through more nuanced policy approaches.
One of the key distinctions of AI is its accessibility. Unlike traditional technologies that were typically deployed and managed by centralised IT teams, AI can be adopted at all levels of an organisation. This democratisation of technology increases the need for organisations to adopt a consistent, standardised approach to AI use, with a strong emphasis on training and enablement.
Prompt of the Month
Specificity
General, open-ended questions like “tell me about AI regulation” can generate rambling responses characterised by long lists of past, present and proposed regulation. Specificity helps get you closer to what you need and is a good example of AI and people working together. This prompt could be focused by asking, “Tell me about current and proposed AI regulation in Australia, specifically NSW. Keep the answer brief and include regulation that affects small businesses.” This keeps the AI on track, and you can always prompt it again to go deeper on a specific issue.
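For readers who use these models through code rather than a chat window, the same principle applies. Below is a minimal sketch using the OpenAI Python client; it assumes an OPENAI_API_KEY environment variable is set, and the model name is illustrative only.

```python
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

# A vague prompt invites a rambling answer; a specific prompt narrows the scope.
vague_prompt = "Tell me about AI regulation."
specific_prompt = (
    "Tell me about current and proposed AI regulation in Australia, specifically NSW. "
    "Keep the answer brief and include regulation that affects small businesses."
)

# Run both prompts to compare how much tighter the specific answer is.
for prompt in (vague_prompt, specific_prompt):
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name; use whichever model you have access to
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {prompt}\n{response.choices[0].message.content}\n")
```

The code changes nothing about the technique: all of the value is in the wording of the prompt itself.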
Spotlight
Many companies we work with struggle to return estimates, RFPs and bids in a timely manner because of the large burden of work these can involve. AI tools are perfectly placed to help improve this process. For example, you can build a profile based on all the previous bids you’ve completed and simply instruct your AI tool to populate the new bid from those previous responses. It’s important to ensure that your AI isn’t training on your data and that this data is held privately, as it is proprietary. It’s also important to prompt the AI not to answer a question where it doesn’t have a high degree of confidence, leaving gaps for customisation. But for those responses that are standard across many bids, this can be a massive time saver.
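For teams that want to automate this, here is a minimal sketch of the approach using the OpenAI Python client. The question library, model name and the “NEEDS CUSTOM RESPONSE” marker are illustrative placeholders, not a finished implementation; the key ideas are keeping your previous answers private and instructing the model to flag anything it cannot answer confidently.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY; choose a provider that does not train on your data

# Hypothetical library of answers drawn from previous bids, held privately.
previous_answers = {
    "Describe your quality assurance process.": "We operate an ISO 9001-aligned QA program ...",
    "What are your standard payment terms?": "Net 30 days from date of invoice ...",
}

new_bid_questions = [
    "Describe your quality assurance process.",
    "Outline your approach to this project's unique site constraints.",
]

# Instruct the model to draft only from previous responses and to flag anything
# it cannot answer confidently, leaving a gap for human customisation.
system_prompt = (
    "You draft bid responses using ONLY the previous answers provided below. "
    "If a question is not clearly covered, reply exactly with 'NEEDS CUSTOM RESPONSE'.\n\n"
    + "\n".join(f"Q: {q}\nA: {a}" for q, a in previous_answers.items())
)

for question in new_bid_questions:
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
    )
    print(f"{question}\n{response.choices[0].message.content}\n")
```

In this sketch, standard questions are drafted automatically from your past bids, while anything novel comes back flagged for a person to write, which is exactly where you want your team spending its time.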