Family sues ChatGPT-maker OpenAI over school shooting in Canada

VANCOUVER, British Columbia (AP) — The parents of a girl critically wounded in a school shooting in Canada alleged in a civil lawsuit Monday that ChatGPT-maker OpenAI knew the shooter was planning a mass attack.

OpenAI has said it considered alerting police about the activities of the person who months later committed one of Canada’s worst school shootings in Tumbler Ridge, British Columbia, on Feb. 10, but did not.

OpenAI came forward to police after Jesse Van Roostselaar killed eight people and then herself last month, saying the attacker’s ChatGPT account had been closed but that she had evaded the ban by using a second account.

The legal claim filed in the British Columbia Supreme Court alleged that OpenAI had “specific knowledge of the shooter utilizing ChatGPT to plan a mass casualty event like the Tumbler Ridge mass shooting.”

The lawsuit said the shooter used OpenAI’s chatbot ChatGPT as a trusted confidante, collaborator and ally, and that the chatbot willingly assists users such as the shooter in planning a mass casualty event.

An OpenAI spokeswoman didn’t immediately respond to a message seeking comment on the lawsuit.

The lawsuit said that as a result of the company’s conduct Maya Gebala was shot three times at close range, with one bullet hitting her head, another her neck and the third grazing her cheek. It said she has a catastrophic brain injury that will leave her with permanent cognitive and physical disabilities.
